Sep 29 18:43:18 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 29 18:43:19 crc restorecon[4743]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 
18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 29 18:43:19 crc restorecon[4743]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 29 18:43:19 crc restorecon[4743]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 29 18:43:20 crc kubenswrapper[4780]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.495369 4780 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502726 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502760 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502770 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502781 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502792 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502803 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502815 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502824 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502834 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502843 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502851 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502859 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502867 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502875 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502883 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502891 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502899 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502906 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502917 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
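[Note on the flag deprecation warnings above: each one points at the same migration, moving the flag into the file passed via the kubelet's --config option. A minimal KubeletConfiguration sketch of that migration follows; the field names are the standard kubelet.config.k8s.io/v1beta1 ones, but the file path and every value shown are illustrative assumptions, not values taken from this node's log.]

# hypothetical /etc/kubernetes/kubelet-config.yaml (path assumed)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (CRI-O socket assumed)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (directory illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --register-with-taints (taint illustrative)
registerWithTaints:
- key: node-role.kubernetes.io/master
  effect: NoSchedule
# replaces --system-reserved (sizes illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi

[--pod-infra-container-image has no config-file equivalent; per the warning above, the image garbage collector now gets the sandbox image from the CRI runtime's own configuration.]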
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502928 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502936 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502945 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502953 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502961 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502970 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502978 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.502998 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503007 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503015 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503023 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503031 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503039 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503073 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503081 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503089 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503101 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503111 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
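The long runs of "unrecognized feature gate" warnings here and below are the upstream Kubernetes feature_gate parser being handed OpenShift-level gate names (AdminNetworkPolicy, GatewayAPI, RouteAdvertisements, and so on) that vanilla Kubernetes does not define; it logs a warning for each and ignores it rather than failing startup. A small sketch to pull the distinct offenders out of a saved journal dump; the filename and the `journalctl -b > kubelet.log` capture step are assumptions, and the regex only matches this log's exact wording:

    import re
    import sys

    # Print the distinct gate names the kubelet flagged as unrecognized.
    # Usage: python3 gates.py kubelet.log   (e.g. journalctl -b > kubelet.log)
    text = open(sys.argv[1], encoding="utf-8", errors="replace").read()
    for gate in sorted(set(re.findall(r"unrecognized feature gate: (\w+)", text))):
        print(gate)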
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503121 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503131 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503140 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503148 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503156 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503164 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503172 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503179 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503188 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503196 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503205 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503244 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503255 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503265 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503274 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503283 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503292 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503301 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503309 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503316 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503325 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503332 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503342 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503350 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503358 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503366 4780 feature_gate.go:330] unrecognized 
feature gate: MultiArchInstallAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503374 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503384 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503392 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503400 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503408 4780 feature_gate.go:330] unrecognized feature gate: Example Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503416 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503447 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.503459 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504490 4780 flags.go:64] FLAG: --address="0.0.0.0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504525 4780 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504550 4780 flags.go:64] FLAG: --anonymous-auth="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504564 4780 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504578 4780 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504588 4780 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504612 4780 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504624 4780 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504633 4780 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504643 4780 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504653 4780 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504663 4780 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504672 4780 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504681 4780 flags.go:64] FLAG: --cgroup-root="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504690 4780 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504699 4780 flags.go:64] FLAG: --client-ca-file="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504708 4780 flags.go:64] FLAG: --cloud-config="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504716 4780 flags.go:64] FLAG: --cloud-provider="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504725 4780 flags.go:64] FLAG: --cluster-dns="[]" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504736 4780 flags.go:64] FLAG: --cluster-domain="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 
18:43:20.504744 4780 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504754 4780 flags.go:64] FLAG: --config-dir="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504763 4780 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504772 4780 flags.go:64] FLAG: --container-log-max-files="5" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504784 4780 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504793 4780 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504802 4780 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504811 4780 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504820 4780 flags.go:64] FLAG: --contention-profiling="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504829 4780 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504837 4780 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504848 4780 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504857 4780 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504868 4780 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504877 4780 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504886 4780 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504896 4780 flags.go:64] FLAG: --enable-load-reader="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504904 4780 flags.go:64] FLAG: --enable-server="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504913 4780 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504937 4780 flags.go:64] FLAG: --event-burst="100" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504948 4780 flags.go:64] FLAG: --event-qps="50" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504958 4780 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504968 4780 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504977 4780 flags.go:64] FLAG: --eviction-hard="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504988 4780 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.504996 4780 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505005 4780 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505014 4780 flags.go:64] FLAG: --eviction-soft="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505023 4780 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505032 4780 flags.go:64] FLAG: --exit-on-lock-contention="false" Sep 29 
18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505041 4780 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505074 4780 flags.go:64] FLAG: --experimental-mounter-path="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505084 4780 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505093 4780 flags.go:64] FLAG: --fail-swap-on="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505101 4780 flags.go:64] FLAG: --feature-gates="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505112 4780 flags.go:64] FLAG: --file-check-frequency="20s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505123 4780 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505135 4780 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505146 4780 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505158 4780 flags.go:64] FLAG: --healthz-port="10248" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505169 4780 flags.go:64] FLAG: --help="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505181 4780 flags.go:64] FLAG: --hostname-override="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505191 4780 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505203 4780 flags.go:64] FLAG: --http-check-frequency="20s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505214 4780 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505225 4780 flags.go:64] FLAG: --image-credential-provider-config="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505236 4780 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505247 4780 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505258 4780 flags.go:64] FLAG: --image-service-endpoint="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505269 4780 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505280 4780 flags.go:64] FLAG: --kube-api-burst="100" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505291 4780 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505302 4780 flags.go:64] FLAG: --kube-api-qps="50" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505313 4780 flags.go:64] FLAG: --kube-reserved="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505324 4780 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505335 4780 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505349 4780 flags.go:64] FLAG: --kubelet-cgroups="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505359 4780 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505371 4780 flags.go:64] FLAG: --lock-file="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505381 4780 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 29 18:43:20 crc kubenswrapper[4780]: 
I0929 18:43:20.505392 4780 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505405 4780 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505422 4780 flags.go:64] FLAG: --log-json-split-stream="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505434 4780 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505445 4780 flags.go:64] FLAG: --log-text-split-stream="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505456 4780 flags.go:64] FLAG: --logging-format="text" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505468 4780 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505480 4780 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505490 4780 flags.go:64] FLAG: --manifest-url="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505503 4780 flags.go:64] FLAG: --manifest-url-header="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505519 4780 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505531 4780 flags.go:64] FLAG: --max-open-files="1000000" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505547 4780 flags.go:64] FLAG: --max-pods="110" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505558 4780 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505570 4780 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505585 4780 flags.go:64] FLAG: --memory-manager-policy="None" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505596 4780 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505609 4780 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505620 4780 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505633 4780 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505656 4780 flags.go:64] FLAG: --node-status-max-images="50" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505665 4780 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505714 4780 flags.go:64] FLAG: --oom-score-adj="-999" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505724 4780 flags.go:64] FLAG: --pod-cidr="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505734 4780 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505753 4780 flags.go:64] FLAG: --pod-manifest-path="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505764 4780 flags.go:64] FLAG: --pod-max-pids="-1" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505776 4780 flags.go:64] FLAG: --pods-per-core="0" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505787 4780 flags.go:64] FLAG: --port="10250" Sep 29 18:43:20 crc 
kubenswrapper[4780]: I0929 18:43:20.505800 4780 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505811 4780 flags.go:64] FLAG: --provider-id="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505823 4780 flags.go:64] FLAG: --qos-reserved="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505834 4780 flags.go:64] FLAG: --read-only-port="10255" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505844 4780 flags.go:64] FLAG: --register-node="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505853 4780 flags.go:64] FLAG: --register-schedulable="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505862 4780 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505878 4780 flags.go:64] FLAG: --registry-burst="10" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505887 4780 flags.go:64] FLAG: --registry-qps="5" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505897 4780 flags.go:64] FLAG: --reserved-cpus="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505906 4780 flags.go:64] FLAG: --reserved-memory="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505917 4780 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505926 4780 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505935 4780 flags.go:64] FLAG: --rotate-certificates="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505944 4780 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505953 4780 flags.go:64] FLAG: --runonce="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505962 4780 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505972 4780 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505981 4780 flags.go:64] FLAG: --seccomp-default="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.505991 4780 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506000 4780 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506010 4780 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506019 4780 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506028 4780 flags.go:64] FLAG: --storage-driver-password="root" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506076 4780 flags.go:64] FLAG: --storage-driver-secure="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506086 4780 flags.go:64] FLAG: --storage-driver-table="stats" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506096 4780 flags.go:64] FLAG: --storage-driver-user="root" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506105 4780 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506114 4780 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506124 4780 flags.go:64] FLAG: --system-cgroups="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 
18:43:20.506133 4780 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506147 4780 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506156 4780 flags.go:64] FLAG: --tls-cert-file="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506165 4780 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506176 4780 flags.go:64] FLAG: --tls-min-version="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506185 4780 flags.go:64] FLAG: --tls-private-key-file="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506194 4780 flags.go:64] FLAG: --topology-manager-policy="none" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506203 4780 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506212 4780 flags.go:64] FLAG: --topology-manager-scope="container" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506222 4780 flags.go:64] FLAG: --v="2" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506235 4780 flags.go:64] FLAG: --version="false" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506246 4780 flags.go:64] FLAG: --vmodule="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506257 4780 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.506267 4780 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506547 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506560 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506568 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506577 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506585 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506593 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506601 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506609 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506621 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
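The flags.go:64 block that ends above is the kubelet echoing every command-line flag with its effective value, defaults included, at the verbosity the dump itself records (--v="2"). That is why --cgroup-driver still reads "cgroupfs" here even though a later line in this log shows the kubelet adopting "systemd" from the CRI runtime. A sketch that rebuilds a flag/value table from a saved dump, under the same assumed capture file as before; the pattern relies on every value being double-quoted, as in this log:

    import re
    import sys

    # Rebuild a name/value table from the kubelet's flags.go:64 dump.
    # Usage: python3 flags.py kubelet.log
    text = open(sys.argv[1], encoding="utf-8", errors="replace").read()
    for name, value in re.findall(r'FLAG: (--[\w-]+)="([^"]*)"', text):
        print(f"{name:40} {value}")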
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506631 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506640 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506648 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506659 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506667 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506676 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506684 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506692 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506700 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506708 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506716 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506724 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506732 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506739 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506747 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506756 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506763 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506771 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506779 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506787 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506796 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506804 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506813 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506821 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506832 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506843 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506852 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506860 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506869 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506878 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506886 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506894 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506902 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506909 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506917 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506928 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506936 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506944 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506955 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506964 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506974 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506983 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506991 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.506999 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507009 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507020 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507029 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507037 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507073 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507081 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507089 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507097 4780 feature_gate.go:330] unrecognized feature gate: Example Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507106 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507113 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507121 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507129 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507137 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507145 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507154 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507162 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507169 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.507177 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.507203 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.519087 4780 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.519128 4780 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519245 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519259 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519269 4780 feature_gate.go:330] unrecognized feature gate: 
MetricsCollectionProfiles Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519278 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519287 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519296 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519304 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519313 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519322 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519331 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519338 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519346 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519354 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519362 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519370 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519378 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519386 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519394 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519402 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519410 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519417 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519425 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519435 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519443 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519451 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519459 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519470 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
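Each warning pass ends with a feature_gate.go:386 line giving the surviving map: only gates upstream Kubernetes recognizes remain, with the explicit overrides applied (KMSv1, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, and ValidatingAdmissionPolicy all true). The identical map is logged three times in this boot, evidently once for each point at which the gate set is re-parsed during startup. To turn such a Go-syntax map line into a Python dict, a minimal sketch with an abbreviated literal; the regex only assumes name:true/false tokens:

    import re

    # Parse a "feature gates: {map[...]}" line (abbreviated here) into a dict.
    line = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}"
    gates = {name: val == "true" for name, val in re.findall(r"(\w+):(true|false)", line)}
    print(gates)  # {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False, ...}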
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519481 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519491 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519501 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519509 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519517 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519527 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519535 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519543 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519552 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519560 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519572 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519582 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519590 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519599 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519606 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519614 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519622 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519629 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519637 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519645 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519653 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519661 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519669 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519677 4780 feature_gate.go:330] unrecognized feature gate: Example Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519686 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519695 4780 feature_gate.go:330] unrecognized 
feature gate: SetEIPForNLBIngressController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519709 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519732 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519746 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519757 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519768 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519784 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519795 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519804 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519814 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519824 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519836 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519847 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519857 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519867 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519877 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519886 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519896 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.519905 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.519923 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520187 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520200 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 
18:43:20.520209 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520218 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520227 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520235 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520243 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520253 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520261 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520270 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520277 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520285 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520293 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520301 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520310 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520317 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520325 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520335 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520347 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520356 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520364 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520372 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520383 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520393 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520401 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520409 4780 feature_gate.go:330] unrecognized feature gate: Example Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520417 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520426 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520434 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520441 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520450 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520457 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520465 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520473 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520480 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520488 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520496 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520504 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520511 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520519 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520527 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520535 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520543 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520550 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 
29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520558 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520568 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520578 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520588 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520596 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520606 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520616 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520625 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520634 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520642 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520651 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520659 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520668 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520676 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520684 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520693 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520703 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520713 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520722 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520730 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520738 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520746 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520755 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520763 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520771 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520779 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.520787 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.520800 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.522704 4780 server.go:940] "Client rotation is on, will bootstrap in background" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.529171 4780 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.529366 4780 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
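"Client rotation is on, will bootstrap in background" together with the still-valid kubeconfig message means no new TLS bootstrap is needed; the certificate_manager lines that follow report the client certificate's expiry, a randomized rotation deadline ahead of it, and the wait until that deadline. The logged wait is simply deadline minus the current time, which a quick arithmetic check reproduces to the second (timestamps copied from the log lines below, sub-second digits dropped):

    from datetime import datetime

    # Rotation deadline and log timestamp from the certificate_manager lines.
    now = datetime(2025, 9, 29, 18, 43, 20)
    deadline = datetime(2025, 11, 25, 3, 3, 29)
    wait = deadline - now
    print(wait)                            # 56 days, 8:20:09
    h, rem = divmod(int(wait.total_seconds()), 3600)
    print(f"{h}h{rem // 60}m{rem % 60}s")  # 1352h20m9s, matching the logged wait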
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.531448 4780 server.go:997] "Starting client certificate rotation"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.531501 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.534012 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-25 03:03:29.730406345 +0000 UTC
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.534270 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1352h20m9.196145843s for next certificate rotation
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.559514 4780 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.562371 4780 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.586602 4780 log.go:25] "Validated CRI v1 runtime API"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.624900 4780 log.go:25] "Validated CRI v1 image API"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.627606 4780 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.635870 4780 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-29-18-38-57-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.635935 4780 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.655623 4780 manager.go:217] Machine: {Timestamp:2025-09-29 18:43:20.651372014 +0000 UTC m=+0.599670078 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7e834951-590e-482e-8249-2efa8589f762 BootID:bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ee:cf:93 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ee:cf:93 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3a:cb:23 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:61:b4:18 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:10:87:f7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:11:dd:56 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:54:01:fc Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:71:1d:43:e0:57 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9a:f8:2d:d9:e3:bf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.655929 4780 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.656104 4780 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.656648 4780 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.656976 4780 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.657036 4780 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.657401 4780 topology_manager.go:138] "Creating topology manager with none policy"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.657419 4780 container_manager_linux.go:303] "Creating device plugin manager"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.658187 4780 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.658246 4780 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.658551 4780 state_mem.go:36] "Initialized new in-memory state store" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.658722 4780 server.go:1245] "Using root directory" path="/var/lib/kubelet" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.662873 4780 kubelet.go:418] "Attempting to sync node with API server" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.662909 4780 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.662950 4780 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.662972 4780 kubelet.go:324] "Adding apiserver pod source" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.662998 4780 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.668436 4780 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.670403 4780 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.672962 4780 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.672957 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.673160 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.672912 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.673546 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674745 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674776 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674786 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674795 4780 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674813 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674823 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674833 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674847 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674859 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674870 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674883 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674892 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.674921 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.675427 4780 server.go:1280] "Started kubelet" Sep 29 18:43:20 crc systemd[1]: Started Kubernetes Kubelet. Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.682538 4780 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.683457 4780 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.683840 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.684328 4780 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.688607 4780 server.go:460] "Adding debug handlers to kubelet server" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.690814 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.690884 4780 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.690937 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 03:29:47.845899979 +0000 UTC Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.691149 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1472h46m27.154761502s for next certificate rotation Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.693030 4780 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.693269 4780 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.693443 4780 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 29 18:43:20 crc 
Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.694150 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.694248 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.693753 4780 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869d511844895a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-29 18:43:20.675399077 +0000 UTC m=+0.623697131,LastTimestamp:2025-09-29 18:43:20.675399077 +0000 UTC m=+0.623697131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.695226 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.699884 4780 factory.go:55] Registering systemd factory
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.700015 4780 factory.go:221] Registration of the systemd container factory successfully
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.701609 4780 factory.go:153] Registering CRI-O factory
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.701762 4780 factory.go:221] Registration of the crio container factory successfully
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.701970 4780 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.702108 4780 factory.go:103] Registering Raw factory
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.702213 4780 manager.go:1196] Started watching for new ooms in manager
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.703370 4780 manager.go:319] Starting recovery of all containers
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708751 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708826 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708838 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708846 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708857 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708866 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708875 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708885 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708896 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708908 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708920 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708932 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708943 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708970 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708983 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.708993 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709001 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709012 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709020 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709029 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709039 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709063 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709074 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709084 4780 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709094 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709104 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709116 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709126 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709137 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709169 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709181 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709209 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709218 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709228 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709237 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709246 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709255 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709264 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709273 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709302 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709312 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709321 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709329 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709374 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709383 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709392 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709402 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709411 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709421 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709431 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709439 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709449 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709484 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709493 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709502 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709513 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709523 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709532 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709543 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709552 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709561 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709569 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709578 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709587 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709597 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709606 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709615 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709624 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709632 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709640 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709649 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709658 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709668 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709676 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709685 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709694 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709704 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709712 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709721 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709731 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709740 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709750 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709759 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709769 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709813 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709826 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709841 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709853 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709866 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709876 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709885 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709895 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709904 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709916 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709924 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709933 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709967 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709979 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709988 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.709998 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710008 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710016 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710024 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710034 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710062 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710074 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710084 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710094 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710103 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710113 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710124 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710137 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710146 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710157 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710167 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710176 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710186 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710194 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710203 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710212 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710225 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710241 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710253 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710266 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710276 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710286 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710296 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710306 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710314 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710323 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710331 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710341 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710349 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710358 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710367 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710377 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710386 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710396 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710405 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710414 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710425 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710434 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710443 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710452 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710461 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710471 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710479 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710487 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710508 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710518 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710527 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710536 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710546 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710558 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710566 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710576 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710586 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710596 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710605 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710616 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710625 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710634 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710645 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710656 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710666 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710676 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710686 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710696 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710706 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710715 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710724 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710734 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710744 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710753 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710762 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710772 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710782 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710805 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710816 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710825 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710833 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710842 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710851 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710859 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710870 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710880 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710888 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710898 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710907 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710914 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710923 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710931 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710941 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710949 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710958 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710967 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710976 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710985 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.710997 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714570 4780 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714607 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714623 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714643 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714662 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714676 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714689 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714702 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714715 4780 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714727 4780 reconstruct.go:97] "Volume reconstruction finished" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.714735 4780 reconciler.go:26] "Reconciler: start to sync state" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.744297 4780 manager.go:324] Recovery completed Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.749603 4780 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.751566 4780 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.751671 4780 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.751763 4780 kubelet.go:2335] "Starting kubelet main sync loop" Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.752085 4780 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 29 18:43:20 crc kubenswrapper[4780]: W0929 18:43:20.753877 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.753982 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.762707 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.764558 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.764595 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.764608 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.766567 4780 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.766584 4780 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.766604 4780 state_mem.go:36] "Initialized new in-memory state store" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.783717 4780 policy_none.go:49] "None policy: Start" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.784768 4780 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.784795 4780 state_mem.go:35] "Initializing new in-memory state store" Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.794631 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.845782 4780 manager.go:334] "Starting Device Plugin manager" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.845884 4780 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.845901 4780 server.go:79] "Starting device plugin registration server" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.846527 4780 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.846551 4780 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.847178 4780 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.847274 4780 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.847297 4780 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.852705 4780 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.852839 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.854285 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.854325 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.854338 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.854686 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.855905 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.855973 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856203 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856215 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856366 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856826 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.856886 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857042 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857093 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857106 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857227 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857261 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857287 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857305 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857434 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857459 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857656 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857673 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857682 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857899 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857936 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.857947 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858004 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858024 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858030 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858144 4780 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858179 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858633 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858656 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858665 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858791 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858817 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.858994 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.859020 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.859059 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.859333 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.859516 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.859540 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.859551 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.900434 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917320 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917369 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:20 crc kubenswrapper[4780]: 
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917399 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917422 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917446 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917471 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917555 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917617 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917644 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917689 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917843 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.917974 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.918001 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.918025 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.918070 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.946899 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.948726 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.948786 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.948801 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:43:20 crc kubenswrapper[4780]: I0929 18:43:20.948839 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 29 18:43:20 crc kubenswrapper[4780]: E0929 18:43:20.949509 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019382 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019466 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019492 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019536 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019556 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019578 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019628 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019623 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019733 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019744 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019653 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019810 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019787 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019737 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019816 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019836 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019842 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019728 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019954 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.019987 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020020 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020086 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020100 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020106 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020110 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020225 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020150 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020294 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.020413 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.150164 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.152004 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.152073 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.152083 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.152113 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.152750 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.195638 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.205310 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.223798 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.243019 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.244109 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4b26f8c11a6e461dedaa34a4f456db79caef4bd785ba36952e2d59024ec8d4a6 WatchSource:0}: Error finding container 4b26f8c11a6e461dedaa34a4f456db79caef4bd785ba36952e2d59024ec8d4a6: Status 404 returned error can't find the container with id 4b26f8c11a6e461dedaa34a4f456db79caef4bd785ba36952e2d59024ec8d4a6
Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.246720 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4e3e5e65b8535ba86f784c81fd71e80c02356218d5389cb221b9ad65b7a36a61 WatchSource:0}: Error finding container 4e3e5e65b8535ba86f784c81fd71e80c02356218d5389cb221b9ad65b7a36a61: Status 404 returned error can't find the container with id 4e3e5e65b8535ba86f784c81fd71e80c02356218d5389cb221b9ad65b7a36a61
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.247538 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.252452 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1791af96ceb530b900814d289e256ea05ba25335278e1efaaabbd287ac8b3495 WatchSource:0}: Error finding container 1791af96ceb530b900814d289e256ea05ba25335278e1efaaabbd287ac8b3495: Status 404 returned error can't find the container with id 1791af96ceb530b900814d289e256ea05ba25335278e1efaaabbd287ac8b3495
Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.267281 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-15139a327c3fa7364a188bfe091ad878f850f48e15ca9363f841a93413cb5cd9 WatchSource:0}: Error finding container 15139a327c3fa7364a188bfe091ad878f850f48e15ca9363f841a93413cb5cd9: Status 404 returned error can't find the container with id 15139a327c3fa7364a188bfe091ad878f850f48e15ca9363f841a93413cb5cd9
Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.301680 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms"
Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.548303 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.548413 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.553861 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.555741 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.555786 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.555798 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.555834 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.556173 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc"
Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.685680 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.80:6443: connect: connection refused Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.757260 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4e3e5e65b8535ba86f784c81fd71e80c02356218d5389cb221b9ad65b7a36a61"} Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.758472 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"15139a327c3fa7364a188bfe091ad878f850f48e15ca9363f841a93413cb5cd9"} Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.762762 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fc812d260381e02897d8565dfa63c1be518d11c4b81302c6dd2aa770d1be322c"} Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.764183 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1791af96ceb530b900814d289e256ea05ba25335278e1efaaabbd287ac8b3495"} Sep 29 18:43:21 crc kubenswrapper[4780]: I0929 18:43:21.766231 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b26f8c11a6e461dedaa34a4f456db79caef4bd785ba36952e2d59024ec8d4a6"} Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.906293 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.906743 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:21 crc kubenswrapper[4780]: W0929 18:43:21.964999 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:21 crc kubenswrapper[4780]: E0929 18:43:21.965168 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:22 crc kubenswrapper[4780]: E0929 18:43:22.103033 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s" Sep 29 18:43:22 crc kubenswrapper[4780]: W0929 18:43:22.244463 4780 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:22 crc kubenswrapper[4780]: E0929 18:43:22.244666 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.356742 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.358692 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.358777 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.358795 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.358861 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 18:43:22 crc kubenswrapper[4780]: E0929 18:43:22.359728 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.685487 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.773661 4780 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e" exitCode=0 Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.773745 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.773858 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.775541 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.775594 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.775615 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.777108 4780 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445" exitCode=0 Sep 29 18:43:22 crc 
kubenswrapper[4780]: I0929 18:43:22.777211 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.777252 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.778617 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.778758 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.778801 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.778820 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780031 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780105 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780131 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780667 4780 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22" exitCode=0 Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780738 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.780811 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.782544 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.782580 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.782595 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.784439 4780 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451" exitCode=0 Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.784495 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.784614 4780 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.791630 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.791694 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.791722 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.795238 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.795304 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.795334 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.795361 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c"} Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.795490 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.797022 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.797079 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:22 crc kubenswrapper[4780]: I0929 18:43:22.797090 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.685083 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:23 crc kubenswrapper[4780]: E0929 18:43:23.703837 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.807339 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.807424 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.807442 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.807453 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.809537 4780 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f" exitCode=0 Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.809625 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.809683 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.810845 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.810880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.810891 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.811556 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.811691 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.813324 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.813362 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.813373 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.818429 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 
18:43:23.818241 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.818695 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.818717 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231"} Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.818443 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.821961 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.822088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.822112 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.822242 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.822261 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.822275 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc kubenswrapper[4780]: W0929 18:43:23.937687 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Sep 29 18:43:23 crc kubenswrapper[4780]: E0929 18:43:23.937832 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.960226 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.961826 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.961869 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:23 crc kubenswrapper[4780]: I0929 18:43:23.961880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:23 crc 
kubenswrapper[4780]: I0929 18:43:23.961911 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 18:43:23 crc kubenswrapper[4780]: E0929 18:43:23.962274 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.349789 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.823669 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f"} Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.823994 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.825283 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.825320 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.825335 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.828901 4780 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d" exitCode=0 Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.829206 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d"} Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.829372 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.829393 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.829642 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.830835 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.830871 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.830982 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.832731 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.832818 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.832841 4780 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.833160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.833219 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:24 crc kubenswrapper[4780]: I0929 18:43:24.833232 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838029 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439"} Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838144 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa"} Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838176 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0"} Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838198 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a"} Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838205 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838286 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.838285 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839847 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839900 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839847 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839975 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:25 crc kubenswrapper[4780]: I0929 18:43:25.839997 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.024260 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.263834 
4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.264174 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.270781 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.270911 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.270934 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.280072 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.359031 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.546306 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.848697 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.849905 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.850334 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81"} Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.850652 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.850904 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.850765 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.851308 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.851335 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.851101 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.851387 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.851408 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.852970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.853062 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:26 crc kubenswrapper[4780]: I0929 18:43:26.853080 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.163428 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.165497 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.165571 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.165606 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.165652 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.851632 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.851631 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.852864 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.852929 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.852947 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.853743 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.853778 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:27 crc kubenswrapper[4780]: I0929 18:43:27.853787 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.615298 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.615658 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.617513 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.617597 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.617624 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.848132 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.854639 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.856076 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.856145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.856171 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.892583 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.893041 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.894775 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.894832 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:28 crc kubenswrapper[4780]: I0929 18:43:28.894857 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:29 crc kubenswrapper[4780]: I0929 18:43:29.546243 4780 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 18:43:29 crc kubenswrapper[4780]: I0929 18:43:29.546349 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 18:43:30 crc kubenswrapper[4780]: I0929 18:43:30.382163 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:30 crc kubenswrapper[4780]: I0929 18:43:30.382474 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:30 crc kubenswrapper[4780]: I0929 18:43:30.384692 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:30 crc kubenswrapper[4780]: I0929 18:43:30.384733 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:30 crc kubenswrapper[4780]: I0929 18:43:30.384745 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:30 crc kubenswrapper[4780]: E0929 18:43:30.860212 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 29 18:43:32 crc kubenswrapper[4780]: I0929 18:43:32.321357 4780 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 29 18:43:32 crc kubenswrapper[4780]: I0929 18:43:32.321657 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:32 crc kubenswrapper[4780]: I0929 18:43:32.323321 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:32 crc kubenswrapper[4780]: I0929 18:43:32.323432 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:32 crc kubenswrapper[4780]: I0929 18:43:32.323457 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:34 crc kubenswrapper[4780]: W0929 18:43:34.470082 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 29 18:43:34 crc kubenswrapper[4780]: I0929 18:43:34.470270 4780 trace.go:236] Trace[479659007]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 18:43:24.468) (total time: 10001ms): Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[479659007]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:43:34.470) Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[479659007]: [10.00166002s] [10.00166002s] END Sep 29 18:43:34 crc kubenswrapper[4780]: E0929 18:43:34.470323 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 29 18:43:34 crc kubenswrapper[4780]: W0929 18:43:34.577298 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 29 18:43:34 crc kubenswrapper[4780]: I0929 18:43:34.577433 4780 trace.go:236] Trace[1534159148]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 18:43:24.575) (total time: 10002ms): Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[1534159148]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (18:43:34.577) Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[1534159148]: [10.002219975s] [10.002219975s] END Sep 29 18:43:34 crc kubenswrapper[4780]: E0929 18:43:34.577467 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 29 18:43:34 crc kubenswrapper[4780]: I0929 18:43:34.685827 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 
29 18:43:34 crc kubenswrapper[4780]: W0929 18:43:34.922910 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 29 18:43:34 crc kubenswrapper[4780]: I0929 18:43:34.923023 4780 trace.go:236] Trace[1863361158]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 18:43:24.921) (total time: 10001ms): Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[1863361158]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:43:34.922) Sep 29 18:43:34 crc kubenswrapper[4780]: Trace[1863361158]: [10.001738442s] [10.001738442s] END Sep 29 18:43:34 crc kubenswrapper[4780]: E0929 18:43:34.923066 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 29 18:43:35 crc kubenswrapper[4780]: I0929 18:43:35.506464 4780 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 18:43:35 crc kubenswrapper[4780]: I0929 18:43:35.506569 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 18:43:35 crc kubenswrapper[4780]: I0929 18:43:35.513074 4780 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 29 18:43:35 crc kubenswrapper[4780]: I0929 18:43:35.513142 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 29 18:43:36 crc kubenswrapper[4780]: I0929 18:43:36.363205 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:36 crc kubenswrapper[4780]: I0929 18:43:36.363348 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:36 crc kubenswrapper[4780]: I0929 18:43:36.364333 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:36 crc kubenswrapper[4780]: I0929 18:43:36.364367 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 
29 18:43:36 crc kubenswrapper[4780]: I0929 18:43:36.364379 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.857743 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.858025 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.859481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.859527 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.859540 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.866211 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.885223 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.885289 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.886499 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.886545 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:38 crc kubenswrapper[4780]: I0929 18:43:38.886564 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:39 crc kubenswrapper[4780]: I0929 18:43:39.153810 4780 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 29 18:43:39 crc kubenswrapper[4780]: I0929 18:43:39.547187 4780 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 18:43:39 crc kubenswrapper[4780]: I0929 18:43:39.547307 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 18:43:39 crc kubenswrapper[4780]: I0929 18:43:39.821103 4780 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.519524 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 29 
18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.519769 4780 trace.go:236] Trace[1051523800]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Sep-2025 18:43:27.739) (total time: 12779ms): Sep 29 18:43:40 crc kubenswrapper[4780]: Trace[1051523800]: ---"Objects listed" error: 12779ms (18:43:40.519) Sep 29 18:43:40 crc kubenswrapper[4780]: Trace[1051523800]: [12.77996635s] [12.77996635s] END Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.519799 4780 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.523568 4780 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.524775 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568323 4780 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42186->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568323 4780 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45598->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568429 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42186->192.168.126.11:17697: read: connection reset by peer" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568448 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45598->192.168.126.11:17697: read: connection reset by peer" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568875 4780 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.568904 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.674905 4780 apiserver.go:52] "Watching apiserver" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.677771 4780 reflector.go:368] Caches populated for 
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.677771 4780 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.678094 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.678583 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.678670 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.678682 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.678730 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.678741 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.678794 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.679418 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.679456 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
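The repeated "Error syncing pod, skipping" entries (two above, one more below) all reduce to the same condition: the runtime reports NetworkReady=false because nothing has yet written a CNI config into /etc/kubernetes/cni/net.d/. A throwaway sketch of that directory check (the path comes from the log message; the extension list mirrors what ocicni scans for, and the real readiness decision is made by CRI-O, not this snippet):

    // cni_ready.go - sketch: report whether any CNI config exists in the
    // directory named by the NetworkPluginNotReady message above.
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	const cniDir = "/etc/kubernetes/cni/net.d" // path from the log message
    	entries, err := os.ReadDir(cniDir)
    	if err != nil {
    		fmt.Println("cannot read CNI dir:", err)
    		return
    	}
    	ready := false
    	for _, e := range entries {
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json": // extensions ocicni scans for
    			fmt.Println("found CNI config:", e.Name())
    			ready = true
    		}
    	}
    	if !ready {
    		fmt.Println("no CNI configuration found; pods needing pod networking will not sync")
    	}
    }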
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.682010 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.682147 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.682581 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.683535 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.683617 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.685308 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.685389 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.685441 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.685550 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.694465 4780 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.717787 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.717787 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
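For readability, the strategic-merge patch that appears backslash-escaped inside the status_manager entry above decodes to the JSON below; the patch was never applied because the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 refused connections:

    {
      "metadata": { "uid": "3b6479f0-333b-4a96-9adf-2099afdc2447" },
      "status": {
        "$setElementOrder/conditions": [
          { "type": "PodReadyToStartContainers" },
          { "type": "Initialized" },
          { "type": "Ready" },
          { "type": "ContainersReady" },
          { "type": "PodScheduled" }
        ],
        "conditions": [
          { "lastTransitionTime": "2025-09-29T18:43:40Z", "status": "False", "type": "PodReadyToStartContainers" },
          { "message": "containers with unready status: [network-check-target-container]", "reason": "ContainersNotReady", "type": "Ready" },
          { "lastTransitionTime": "2025-09-29T18:43:40Z", "message": "containers with unready status: [network-check-target-container]", "reason": "ContainersNotReady", "status": "False", "type": "ContainersReady" }
        ],
        "containerStatuses": [
          {
            "image": "quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b",
            "imageID": "",
            "lastState": { "terminated": { "exitCode": 137, "finishedAt": null, "message": "The container could not be located when the pod was deleted. The container used to be Running", "reason": "ContainerStatusUnknown", "startedAt": null } },
            "name": "network-check-target-container",
            "ready": false,
            "restartCount": 3,
            "started": false,
            "state": { "waiting": { "reason": "ContainerCreating" } },
            "volumeMounts": [ { "mountPath": "/var/run/secrets/kubernetes.io/serviceaccount", "name": "kube-api-access-cqllr", "readOnly": true, "recursiveReadOnly": "Disabled" } ]
          }
        ],
        "podIP": null,
        "podIPs": null
      }
    }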
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725600 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725627 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725648 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725674 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725697 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725725 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725755 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725773 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725773 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). 
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.725773 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.726676 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727243 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727545 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727601 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727711 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727775 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
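From this point to the end of the excerpt, the reconciler is draining volumes for deleted pods in pairs: each "operationExecutor.UnmountVolume started" line should be answered by an "UnmountVolume.TearDown succeeded" line carrying the same UniqueName. A throwaway sketch for auditing a saved copy of this journal for unmounts that never completed (the regexes follow the two line formats above; the kubelet.log filename is hypothetical):

    // unmount_audit.go - sketch: pair "UnmountVolume started" entries with
    // "TearDown succeeded" entries by volume UniqueName and print stragglers.
    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"regexp"
    )

    var (
    	// UniqueName inside a started entry; quotes may appear as \" in the raw log.
    	started   = regexp.MustCompile(`UnmountVolume started for volume .*?\(UniqueName: \\?"([^"\\]+)\\?"`)
    	succeeded = regexp.MustCompile(`UnmountVolume\.TearDown succeeded for volume "([^"]+)"`)
    )

    func main() {
    	f, err := os.Open("kubelet.log") // hypothetical saved journal excerpt
    	if err != nil {
    		fmt.Println(err)
    		return
    	}
    	defer f.Close()

    	pending := map[string]bool{}
    	sc := bufio.NewScanner(f)
    	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
    	for sc.Scan() {
    		line := sc.Text()
    		if m := started.FindStringSubmatch(line); m != nil {
    			pending[m[1]] = true
    		} else if m := succeeded.FindStringSubmatch(line); m != nil {
    			delete(pending, m[1])
    		}
    	}
    	for name := range pending {
    		fmt.Println("unmount started but no TearDown succeeded:", name)
    	}
    }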
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.727857 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728154 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728220 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728276 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728461 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728587 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728741 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728862 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728969 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728642 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.728942 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.729274 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.730527 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.730759 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.730858 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.729275 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.729379 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.729457 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.730917 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731304 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731340 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731328 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731691 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731768 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731778 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731821 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731858 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731889 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731923 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731956 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.731984 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732017 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732092 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732124 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732157 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732190 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732220 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732232 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732250 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732282 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732337 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732379 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732407 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732441 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732469 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732506 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732531 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732581 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732610 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732634 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732660 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732687 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732723 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732762 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732790 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732819 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732843 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732875 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732909 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732941 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732967 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732998 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733023 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733076 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733099 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume 
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733099 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733125 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733155 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733178 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733210 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733238 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733263 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733285 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733310 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733336 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733381 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733407 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732274 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732577 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732594 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733467 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732651 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732692 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733475 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732773 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732710 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733576 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733659 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733762 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733821 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733490 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733862 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733885 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733905 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733920 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733926 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733947 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733968 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733987 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734006 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734026 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734062 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734079 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734097 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734118 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734134 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734155 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734175 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734195 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734156 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734213 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734232 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734241 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734250 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734302 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734328 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734360 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734394 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734491 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734515 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734518 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734566 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734598 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734628 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734657 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734674 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734701 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732967 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733253 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733278 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.733439 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734801 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734863 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734690 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.734985 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735040 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735113 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735162 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 
29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735188 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735205 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735237 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735251 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735288 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735332 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735372 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735410 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735423 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735449 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735491 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735524 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735554 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735567 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735616 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735657 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735695 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735736 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735778 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735798 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735815 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735852 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735858 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735888 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.735901 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736027 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736082 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736112 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736138 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736157 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736176 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736208 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.732937 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736239 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736267 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736295 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736350 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736342 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736378 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736410 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736414 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736437 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736473 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736501 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736530 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736595 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736624 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736596 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736655 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736688 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.736747 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737367 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737488 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737599 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737731 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737769 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.737901 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738090 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738106 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738144 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738254 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738265 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738295 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738326 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738362 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738399 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738348 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738429 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738474 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738548 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738586 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738616 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738655 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738673 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738688 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738719 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738752 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738783 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738809 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738744 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.738835 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.739020 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.739130 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.739087 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.739090 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.739712 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740090 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740054 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740126 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740132 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740299 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740384 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740391 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740492 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740507 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740924 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740929 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.741085 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.741118 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.741176 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.741777 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.742165 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.742299 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.742831 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.740579 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.742926 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.742972 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743003 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743027 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743065 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" 
(UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743089 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743115 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743135 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743155 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743182 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743186 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743202 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743311 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743357 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743404 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743387 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743471 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743488 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743507 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743520 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743620 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743651 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743667 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743688 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743714 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743738 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743788 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743818 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743839 4780 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743860 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743896 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743935 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.743975 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744004 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744034 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744073 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744124 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744134 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744137 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744167 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744174 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744819 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.745066 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.745494 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.745538 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.745549 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.745911 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.746260 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.746683 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.746591 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.746760 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.746980 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.747022 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.747174 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.747440 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.747558 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.747597 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748013 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748081 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748219 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748374 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748384 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748711 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748768 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748899 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748999 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748905 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.748676 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749290 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749213 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749215 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749278 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749691 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.749144 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.750101 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.750129 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.750242 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:43:41.25022244 +0000 UTC m=+21.198520484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.750641 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.750826 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.750971 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751165 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751178 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.744080 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751361 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751421 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751460 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751503 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751580 4780 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751940 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751654 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751370 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751622 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.751760 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752320 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.751772 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.752484 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:41.252455641 +0000 UTC m=+21.200753695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751779 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752535 4780 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751892 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.752552 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:41.252529923 +0000 UTC m=+21.200828187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752548 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751918 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.751972 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752005 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752597 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752618 4780 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752632 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752645 4780 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752659 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752675 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752690 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752713 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752726 4780 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752739 4780 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752753 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752036 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752033 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752658 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752762 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752775 4780 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752909 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753036 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.752980 4780 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753180 4780 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753200 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753218 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753233 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753247 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753264 4780 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753279 4780 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753294 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753310 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753321 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753326 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753373 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753388 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753401 4780 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753416 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753432 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753446 4780 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753461 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753475 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753487 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753501 4780 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753514 4780 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753525 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" 
DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753540 4780 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753552 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753565 4780 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753577 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753593 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753604 4780 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753617 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753629 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753642 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753653 4780 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753666 4780 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753679 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753691 4780 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 
29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753704 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753717 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753732 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753745 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753680 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.753757 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754601 4780 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754626 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754647 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754666 4780 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754686 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754704 4780 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754722 4780 reconciler_common.go:293] "Volume detached for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754740 4780 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754763 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754781 4780 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754799 4780 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754817 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754835 4780 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754923 4780 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754941 4780 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754960 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754979 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754949 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755015 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754794 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755312 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.754996 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755390 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755410 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755543 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755429 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755587 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755598 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755611 4780 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755643 4780 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755656 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755669 4780 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755681 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755690 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755723 4780 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755734 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755744 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755754 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755763 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755774 4780 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755805 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755815 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755824 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755835 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755846 4780 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755862 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755879 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755892 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755857 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755930 4780 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755958 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755969 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755979 4780 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755988 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.755998 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756008 4780 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756034 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756055 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756066 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756078 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756141 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756153 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756162 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756172 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756181 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756212 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756223 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756234 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756244 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756253 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756263 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756289 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756296 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756330 4780 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756371 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756384 4780 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756400 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756413 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756427 4780 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756439 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756452 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756463 4780 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 
crc kubenswrapper[4780]: I0929 18:43:40.756474 4780 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756486 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756498 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756510 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756521 4780 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756533 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756546 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756557 4780 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756568 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756580 4780 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756592 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756606 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756620 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc 
kubenswrapper[4780]: I0929 18:43:40.756632 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756645 4780 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.756657 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.759208 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.759283 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.760202 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.760245 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.760840 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.761714 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.762406 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.765616 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.765658 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.767669 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.767779 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.768440 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.768470 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.768486 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.768552 4780 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:41.268534173 +0000 UTC m=+21.216832227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.769312 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.769336 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.769349 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:40 crc kubenswrapper[4780]: E0929 18:43:40.769389 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:41.269375566 +0000 UTC m=+21.217673770 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.771261 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.771711 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.772477 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.772778 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.774093 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.774131 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.774401 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.774566 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.774598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.776686 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.776983 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.778499 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.778565 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.778656 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.778854 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.779502 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.780338 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.780965 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.781189 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.782013 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.782753 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.783712 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.786721 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.788453 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.791159 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.792167 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.793108 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.794713 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.795612 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.797355 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.798916 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.799744 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.800931 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.800944 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.801571 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.801716 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.802657 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.803469 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.804039 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.804469 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.805177 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.806423 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.807370 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.808420 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.809669 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 29 18:43:40 crc 
kubenswrapper[4780]: I0929 18:43:40.813563 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.838165 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.846528 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.847352 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.848202 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.848810 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.851143 4780 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.854578 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857080 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857192 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857232 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857304 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857501 4780 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 
18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857528 4780 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857542 4780 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857557 4780 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857569 4780 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857610 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857624 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857637 4780 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857649 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857666 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857680 4780 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857695 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857713 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857730 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc 
kubenswrapper[4780]: I0929 18:43:40.857747 4780 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857770 4780 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857786 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857801 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857818 4780 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857886 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857912 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857924 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857937 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857950 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857963 4780 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857975 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857987 4780 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.857999 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858012 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858025 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858038 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858063 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858074 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858083 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858094 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858105 4780 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858115 4780 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858127 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858137 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858146 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858156 4780 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858165 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858176 4780 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858185 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858196 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858209 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858219 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858230 4780 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.858240 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.859349 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.865948 4780 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.866143 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.869980 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.880267 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.880389 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.881150 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.881712 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.882084 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.891430 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.900542 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.912989 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.920909 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.923795 4780 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f" exitCode=255 Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.951363 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.952265 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.952908 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.953623 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.954375 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.954939 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 29 18:43:40 crc 
kubenswrapper[4780]: I0929 18:43:40.955614 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.956321 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.958004 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.958679 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.958943 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.959554 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.960747 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.961559 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.962508 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.962978 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.963538 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.964622 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.965216 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.966143 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 
18:43:40.966603 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f"} Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.979699 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.989968 4780 scope.go:117] "RemoveContainer" containerID="6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.990851 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.991725 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.991727 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:40 crc kubenswrapper[4780]: I0929 18:43:40.998262 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.001850 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.006172 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.012705 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.022370 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.037897 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.269386 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.269487 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.269522 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.269549 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269595 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:43:42.269558942 +0000 UTC m=+22.217856996 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.269659 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269671 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269689 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269703 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269763 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:42.269746487 +0000 UTC m=+22.218044531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269806 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269854 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:42.26984411 +0000 UTC m=+22.218142344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.269967 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.270000 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:42.269991524 +0000 UTC m=+22.218289758 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.270109 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.270146 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.270161 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.270234 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:42.27021031 +0000 UTC m=+22.218508354 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.752082 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:41 crc kubenswrapper[4780]: E0929 18:43:41.752230 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.929019 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.929093 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a45499c62585c358b04da6d8b608de4753ff0436126e95f50d6988dbd84f98ba"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.930647 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.932538 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.932745 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.933648 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"10a8480b48cec8b8f48ee8ea36200e0f921fce7dc7919dca764d0e924dde2261"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.935385 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.935415 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.935430 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1876c18bdfe36f5c7b0ad3cea0f36d70f6089649cb74f76d74580a7a1a19d266"} Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.957214 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.968501 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.980687 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.995101 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:41 crc kubenswrapper[4780]: I0929 18:43:41.999697 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pgf7g"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.000127 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.002565 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.006571 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.006850 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.018321 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.033661 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29
T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.048488 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.062041 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.074917 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.077131 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-hosts-file\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.077168 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2brb\" (UniqueName: \"kubernetes.io/projected/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-kube-api-access-j2brb\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.116663 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.166647 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.178218 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-hosts-file\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.178263 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2brb\" (UniqueName: \"kubernetes.io/projected/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-kube-api-access-j2brb\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.178375 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-hosts-file\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.179495 4780 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.194715 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.197331 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2brb\" (UniqueName: \"kubernetes.io/projected/f58bd68b-97a1-4a2b-a772-c6f8a3ea2472-kube-api-access-j2brb\") pod \"node-resolver-pgf7g\" (UID: \"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\") " pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.217027 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.230238 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.278710 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.278805 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.278836 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.278866 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.278896 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 
29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.278990 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279082 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:44.279038474 +0000 UTC m=+24.227336518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279479 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:43:44.279467866 +0000 UTC m=+24.227765910 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279579 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279600 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279614 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279644 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:44.27963544 +0000 UTC m=+24.227933484 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279699 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279729 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:44.279717553 +0000 UTC m=+24.228015597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279787 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279800 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279810 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.279839 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:44.279829156 +0000 UTC m=+24.228127200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.313791 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pgf7g" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.359472 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.374321 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.400714 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.401429 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.434350 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.449119 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wc8rf"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.449708 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.458365 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.458583 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.458718 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.459310 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.459437 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.462571 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jrs9w"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.462864 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gk8l9"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.463332 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.463585 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.468736 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.468886 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p7vtr"] Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469019 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469195 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469378 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469434 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469589 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469673 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469763 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.469882 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.472499 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.474540 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.478472 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.478626 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.478726 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.479628 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.479799 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.516596 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.546550 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582423 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582459 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582475 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582490 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cnibin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582527 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-hostroot\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582544 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582562 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-kubelet\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582584 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-bin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582782 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a6d63c-6762-464e-9216-a234506b74db-proxy-tls\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582825 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7sn\" (UniqueName: \"kubernetes.io/projected/67a6d63c-6762-464e-9216-a234506b74db-kube-api-access-zf7sn\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582857 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-netns\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582934 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzswm\" (UniqueName: \"kubernetes.io/projected/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-kube-api-access-tzswm\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.582983 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583030 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67a6d63c-6762-464e-9216-a234506b74db-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583087 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-cnibin\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583104 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-os-release\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: 
\"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583142 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583164 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-etc-kubernetes\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583179 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583203 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583230 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/67a6d63c-6762-464e-9216-a234506b74db-rootfs\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583245 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cni-binary-copy\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583262 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583277 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583302 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583317 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-os-release\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583332 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583351 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583365 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2sf\" (UniqueName: \"kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583382 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfvp\" (UniqueName: \"kubernetes.io/projected/772477ed-f72b-4cae-9042-d9284309476c-kube-api-access-xbfvp\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583399 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583419 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583435 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: 
I0929 18:43:42.583450 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583471 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-daemon-config\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583484 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583498 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583517 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-multus\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583532 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-conf-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583548 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-system-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583563 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-k8s-cni-cncf-io\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583581 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc 
kubenswrapper[4780]: I0929 18:43:42.583597 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-multus-certs\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583611 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583640 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583659 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-socket-dir-parent\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583674 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.583690 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.584358 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.609824 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.621774 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.636481 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.650692 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.660976 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.683381 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: 
I0929 18:43:42.684066 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684102 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684122 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684168 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-multus\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684189 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-conf-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684221 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-daemon-config\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684237 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684235 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684252 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684311 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684339 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-system-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684819 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-k8s-cni-cncf-io\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684865 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-multus-certs\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684923 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684953 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.684976 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685002 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-socket-dir-parent\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685062 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: 
\"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685039 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685086 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685148 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-multus\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685169 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685189 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-conf-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685201 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685237 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cnibin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685265 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-hostroot\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685301 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685322 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-kubelet\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685423 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-kubelet\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685673 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-system-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685722 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cnibin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685755 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685788 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-k8s-cni-cncf-io\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685829 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-socket-dir-parent\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685858 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-multus-certs\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd\") pod \"ovnkube-node-p7vtr\" (UID: 
\"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685911 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685938 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.685971 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686005 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686034 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-hostroot\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686038 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/772477ed-f72b-4cae-9042-d9284309476c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686144 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-cni-dir\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686187 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-system-cni-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686432 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-multus-daemon-config\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 
18:43:42.686539 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686570 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a6d63c-6762-464e-9216-a234506b74db-proxy-tls\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686608 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7sn\" (UniqueName: \"kubernetes.io/projected/67a6d63c-6762-464e-9216-a234506b74db-kube-api-access-zf7sn\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686651 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-netns\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686676 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-bin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686712 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67a6d63c-6762-464e-9216-a234506b74db-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686739 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-cnibin\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686760 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-os-release\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686803 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzswm\" (UniqueName: \"kubernetes.io/projected/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-kube-api-access-tzswm\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686869 
4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.686945 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687026 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/67a6d63c-6762-464e-9216-a234506b74db-rootfs\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687085 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cni-binary-copy\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687117 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-etc-kubernetes\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687165 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687193 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687241 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687267 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687311 4780 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687335 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-os-release\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687342 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687383 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687415 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfvp\" (UniqueName: \"kubernetes.io/projected/772477ed-f72b-4cae-9042-d9284309476c-kube-api-access-xbfvp\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687464 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687487 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687515 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2sf\" (UniqueName: \"kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687614 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-run-netns\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.687657 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-host-var-lib-cni-bin\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.688358 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67a6d63c-6762-464e-9216-a234506b74db-mcd-auth-proxy-config\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.688443 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-cnibin\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.688625 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-os-release\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.688637 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.688968 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-os-release\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689157 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689209 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-etc-kubernetes\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689226 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689256 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/67a6d63c-6762-464e-9216-a234506b74db-rootfs\") pod \"machine-config-daemon-jrs9w\" (UID: 
\"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689264 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689172 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689308 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.689774 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-cni-binary-copy\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.690389 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/772477ed-f72b-4cae-9042-d9284309476c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.690752 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.691645 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a6d63c-6762-464e-9216-a234506b74db-proxy-tls\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.692009 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.695919 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.706498 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbfvp\" (UniqueName: \"kubernetes.io/projected/772477ed-f72b-4cae-9042-d9284309476c-kube-api-access-xbfvp\") pod \"multus-additional-cni-plugins-gk8l9\" (UID: \"772477ed-f72b-4cae-9042-d9284309476c\") " pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.709632 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.709969 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzswm\" (UniqueName: \"kubernetes.io/projected/2c2af9fc-5cef-48e3-8070-cf2767bc4a81-kube-api-access-tzswm\") pod \"multus-wc8rf\" (UID: \"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\") " pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.712842 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7sn\" (UniqueName: \"kubernetes.io/projected/67a6d63c-6762-464e-9216-a234506b74db-kube-api-access-zf7sn\") pod \"machine-config-daemon-jrs9w\" (UID: \"67a6d63c-6762-464e-9216-a234506b74db\") " pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.714163 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2sf\" (UniqueName: \"kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf\") pod \"ovnkube-node-p7vtr\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.724295 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.738076 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.753101 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.753137 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.753256 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.753465 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.753552 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.757597 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.758460 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.762283 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wc8rf" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.777192 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.794583 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.797302 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.809829 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-pl
ugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a7
14c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.812909 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.832106 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: W0929 18:43:42.837407 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a6d63c_6762_464e_9216_a234506b74db.slice/crio-2fe0f4f0bfb65c1e046cc375d82ee5f0c50eaf36d9149017aeae10b90cc490dd WatchSource:0}: Error finding container 2fe0f4f0bfb65c1e046cc375d82ee5f0c50eaf36d9149017aeae10b90cc490dd: Status 404 returned error can't find the container with id 
2fe0f4f0bfb65c1e046cc375d82ee5f0c50eaf36d9149017aeae10b90cc490dd Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.837720 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:42 crc kubenswrapper[4780]: W0929 18:43:42.865768 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a328df_2763_44f9_9512_3abb64ef45aa.slice/crio-ff6aee3711ee63e8d82b287ad9d225eeb8ccfdc00388a17604d9341baa1e1692 WatchSource:0}: Error finding container ff6aee3711ee63e8d82b287ad9d225eeb8ccfdc00388a17604d9341baa1e1692: Status 404 returned error can't find the container with id ff6aee3711ee63e8d82b287ad9d225eeb8ccfdc00388a17604d9341baa1e1692 Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.954626 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"2fe0f4f0bfb65c1e046cc375d82ee5f0c50eaf36d9149017aeae10b90cc490dd"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.956237 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerStarted","Data":"68d6335ca16dc342fcb8497170796bbfe50ca76779d4b501db6672bc7345c1c0"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.959069 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerStarted","Data":"59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.959092 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerStarted","Data":"f27cbcf7a6bfda8761d86c7dbcb00165a00de88eed0b5ddf452fb2195f581d37"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.969101 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgf7g" event={"ID":"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472","Type":"ContainerStarted","Data":"c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.969132 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgf7g" event={"ID":"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472","Type":"ContainerStarted","Data":"eb351df5d463eb8bb3ad487960e5407a996fc59485c86267a712cd4db205b71e"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.971597 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"ff6aee3711ee63e8d82b287ad9d225eeb8ccfdc00388a17604d9341baa1e1692"} Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.981030 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:42 crc kubenswrapper[4780]: E0929 18:43:42.983905 4780 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Sep 29 18:43:42 crc kubenswrapper[4780]: I0929 18:43:42.996763 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:42Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.012630 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.036975 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r
2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.062817 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.097252 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is 
after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.157056 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/va
r/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.186419 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.205433 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.221294 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.242710 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.262782 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.279952 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.300483 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.327160 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.343102 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.360681 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.373334 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.393123 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.409234 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.424473 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.437505 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.450975 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.463596 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.473571 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.491199 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:43 crc kubenswrapper[4780]: 
I0929 18:43:43.752811 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:43 crc kubenswrapper[4780]: E0929 18:43:43.752994 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.977976 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11"} Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.978074 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5"} Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.980562 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee" exitCode=0 Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.980624 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee"} Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.982449 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" exitCode=0 Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.982545 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.984500 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4"} Sep 29 18:43:43 crc kubenswrapper[4780]: I0929 18:43:43.998086 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:43Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.023100 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.038664 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.056882 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.074689 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.086085 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.109257 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.119586 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.132829 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.151082 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-2
9T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8
c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.161994 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.176794 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.187383 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.201700 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc 
kubenswrapper[4780]: I0929 18:43:44.219989 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.234498 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.248486 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.260542 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.273502 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.286928 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.301735 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.306920 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.307022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307113 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:43:48.307090016 +0000 UTC m=+28.255388070 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.307159 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.307190 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307201 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307548 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307565 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.307219 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307677 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:48.307664242 +0000 UTC m=+28.255962286 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307262 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307729 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:48.307716953 +0000 UTC m=+28.256014997 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307273 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307758 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:48.307752034 +0000 UTC m=+28.256050078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307492 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307777 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307787 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.307822 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-09-29 18:43:48.307813606 +0000 UTC m=+28.256111650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.314873 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.327803 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.338962 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.358116 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.367139 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:44Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.752562 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.752627 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.753370 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:44 crc kubenswrapper[4780]: E0929 18:43:44.753499 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.991642 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.991705 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.991718 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.991730 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.991741 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.993933 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6" exitCode=0 Sep 29 18:43:44 crc kubenswrapper[4780]: I0929 18:43:44.994533 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6"} Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.008479 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.021271 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.038911 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-
29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.059923 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.073746 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.086931 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.102374 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.117124 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.129131 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.141021 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.155141 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.167786 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.184521 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.514303 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-f8mfd"] Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.514921 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.516975 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.517289 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.517679 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.518176 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.537563 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.551407 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.563906 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.579328 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-
29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.610400 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.624389 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjjf\" (UniqueName: \"kubernetes.io/projected/4e46edd0-3650-4fbc-8ad6-d29defbd30de-kube-api-access-sjjjf\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.624432 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e46edd0-3650-4fbc-8ad6-d29defbd30de-host\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.624466 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e46edd0-3650-4fbc-8ad6-d29defbd30de-serviceca\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.624990 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.638237 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.648695 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.659567 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.684619 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.697474 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.709429 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.721595 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.725831 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e46edd0-3650-4fbc-8ad6-d29defbd30de-host\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.725875 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e46edd0-3650-4fbc-8ad6-d29defbd30de-serviceca\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.725935 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjjf\" (UniqueName: \"kubernetes.io/projected/4e46edd0-3650-4fbc-8ad6-d29defbd30de-kube-api-access-sjjjf\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.725987 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e46edd0-3650-4fbc-8ad6-d29defbd30de-host\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.727266 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e46edd0-3650-4fbc-8ad6-d29defbd30de-serviceca\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.731584 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:45Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.744959 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjjf\" (UniqueName: \"kubernetes.io/projected/4e46edd0-3650-4fbc-8ad6-d29defbd30de-kube-api-access-sjjjf\") pod \"node-ca-f8mfd\" (UID: \"4e46edd0-3650-4fbc-8ad6-d29defbd30de\") " pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.752448 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:45 crc kubenswrapper[4780]: E0929 18:43:45.752613 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:45 crc kubenswrapper[4780]: I0929 18:43:45.832742 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f8mfd" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.001494 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067" exitCode=0 Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.001654 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.009265 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.011034 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f8mfd" event={"ID":"4e46edd0-3650-4fbc-8ad6-d29defbd30de","Type":"ContainerStarted","Data":"fb1c470f9445633de9e3a123139e199eeb4155733439504497e03c31cde2b658"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.019647 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.034195 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.045380 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.058250 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.071724 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.085264 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.097859 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.119290 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.130971 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.144589 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.161037 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.179656 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.194717 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.208477 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.552810 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.557400 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.561606 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.566186 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.578161 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.589535 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.602650 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.615194 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.633781 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.652184 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.673223 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.686034 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.699437 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.717384 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.730491 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.743225 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.752755 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.752859 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:46 crc kubenswrapper[4780]: E0929 18:43:46.753126 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:46 crc kubenswrapper[4780]: E0929 18:43:46.753511 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.759602 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.773140 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.790759 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.802180 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.817576 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.840153 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.855168 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.867615 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.880804 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.921277 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.925290 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.927443 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.927494 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.927512 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.927889 4780 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.944217 4780 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.944590 4780 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.946453 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.946504 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.946519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.947005 4780 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.947098 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:46Z","lastTransitionTime":"2025-09-29T18:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.950979 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.968034 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: E0929 18:43:46.970497 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.974895 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.974943 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.974954 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.974973 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.974985 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:46Z","lastTransitionTime":"2025-09-29T18:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.981651 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: E0929 18:43:46.988909 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.992970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.993011 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.993021 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.993040 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.993070 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:46Z","lastTransitionTime":"2025-09-29T18:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:46 crc kubenswrapper[4780]: I0929 18:43:46.994682 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:46Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: E0929 18:43:47.003984 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.007537 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.007566 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.007574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.007589 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.007598 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.009500 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.015981 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f8mfd" event={"ID":"4e46edd0-3650-4fbc-8ad6-d29defbd30de","Type":"ContainerStarted","Data":"9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.018566 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b" exitCode=0 Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.018632 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b"} Sep 29 18:43:47 crc kubenswrapper[4780]: E0929 18:43:47.020558 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.027770 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.027814 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.027825 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.027858 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.027869 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.040904 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: E0929 18:43:47.041063 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: E0929 18:43:47.041666 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.044104 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.044232 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.044308 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.044386 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.044472 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.057419 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.082757 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.123253 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.147828 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.147867 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.147877 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.147893 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.147904 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.160248 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.209713 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.240513 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.250529 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.250579 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.250587 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.250606 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.250616 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.279448 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.321272 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.353206 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.353297 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.353322 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.353357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.353383 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.370328 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:
43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.406507 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.441838 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.456523 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.456594 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.456615 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.456643 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.456662 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.485233 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.526312 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.560177 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.560241 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.560254 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.560286 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.560307 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.569403 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.601496 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:47Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.663257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.663405 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.663496 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.663582 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.663658 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.752911 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:47 crc kubenswrapper[4780]: E0929 18:43:47.753561 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.766471 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.766510 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.766523 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.766543 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.766557 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.870177 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.870218 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.870232 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.870253 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.870267 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.974156 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.974209 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.974221 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.974242 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:47 crc kubenswrapper[4780]: I0929 18:43:47.974257 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:47Z","lastTransitionTime":"2025-09-29T18:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.028522 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4" exitCode=0 Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.028640 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.042616 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.068510 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.079712 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.079747 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.079764 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.079790 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.079807 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.087680 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.113259 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.131165 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.150024 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad6
4058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.166402 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.182788 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.182819 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.182829 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.182847 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.182858 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.183370 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.196118 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.210285 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.223302 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.240470 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.256000 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.277000 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.285143 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.285190 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.285202 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.285222 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.285234 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.290883 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.304359 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.355240 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.355372 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355462 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:43:56.355435192 +0000 UTC m=+36.303733236 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355502 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355542 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:56.355533555 +0000 UTC m=+36.303831599 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.355499 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355602 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355638 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355651 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355687 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:56.355679599 +0000 UTC m=+36.303977633 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.355729 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.355755 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.355933 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.356076 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:56.356028119 +0000 UTC m=+36.304326223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.356677 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.356702 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.356718 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.356763 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:56.356751039 +0000 UTC m=+36.305049083 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.388270 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.388347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.388409 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.388435 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.388451 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.490839 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.490884 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.490895 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.490916 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.490929 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.594283 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.594334 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.594348 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.594369 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.594383 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.698942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.699000 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.699020 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.699071 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.699090 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.752233 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.752330 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.752491 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:48 crc kubenswrapper[4780]: E0929 18:43:48.752735 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.802083 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.802136 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.802148 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.802170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.802184 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.905415 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.905478 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.905495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.905519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:48 crc kubenswrapper[4780]: I0929 18:43:48.905541 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:48Z","lastTransitionTime":"2025-09-29T18:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.009706 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.009764 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.009782 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.009808 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.009828 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.061001 4780 generic.go:334] "Generic (PLEG): container finished" podID="772477ed-f72b-4cae-9042-d9284309476c" containerID="a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570" exitCode=0 Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.061072 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerDied","Data":"a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.080874 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.106184 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.113125 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.113160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.113170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.113187 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.113201 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.124237 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.138500 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.162689 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z 
is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.181792 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.195544 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.216332 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.216375 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.216388 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.216407 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.216418 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.219729 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-2
9T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8
c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.232963 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.245694 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.259994 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.276935 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.290427 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.306894 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.317491 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:49Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.318834 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.318855 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.318864 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.318880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.318890 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.421814 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.421899 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.421917 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.421942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.421985 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.524380 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.524443 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.524460 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.524490 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.524509 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.628298 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.628379 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.628405 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.628446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.628474 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.731488 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.731528 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.731537 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.731553 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.731565 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.752161 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:49 crc kubenswrapper[4780]: E0929 18:43:49.752441 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.833831 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.833869 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.833879 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.833898 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.833909 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.937469 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.937573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.937616 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.937653 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:49 crc kubenswrapper[4780]: I0929 18:43:49.937676 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:49Z","lastTransitionTime":"2025-09-29T18:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.040850 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.040914 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.040929 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.040954 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.040969 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.070940 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.071269 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.076697 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" event={"ID":"772477ed-f72b-4cae-9042-d9284309476c","Type":"ContainerStarted","Data":"abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.093540 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.099006 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.108875 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.123526 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.141893 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.145490 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.145532 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.145544 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.145566 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.145582 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.173457 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105
f7d3b0690efddcf3af963eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.190796 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.203794 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.226189 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeove
rride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.248202 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.248300 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.248328 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.248365 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.248392 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.252648 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf26892
67766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.270839 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.291621 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.310585 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.330851 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.351068 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.351114 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.351126 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.351147 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.351159 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.355038 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.368933 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.390336 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.402619 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.414385 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.427894 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.444437 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.454270 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.454318 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.454332 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.454356 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.454371 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.463574 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.479132 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.491457 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.504499 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.522003 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.536692 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.549061 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.556446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.556481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.556492 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.556516 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.556527 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.572580 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.588235 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.598772 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.659034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.659091 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.659099 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.659120 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.659130 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.752731 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.752841 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:50 crc kubenswrapper[4780]: E0929 18:43:50.753009 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:50 crc kubenswrapper[4780]: E0929 18:43:50.753305 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.765766 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.765811 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.765837 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.765852 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.765865 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.779384 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347202
43b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.797095 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.830403 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.850162 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.868578 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.868681 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.868713 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.868754 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.868778 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.874588 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.891349 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.907745 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.921440 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.937881 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.950385 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.963872 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.970841 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.970901 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.970917 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.970942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.970960 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:50Z","lastTransitionTime":"2025-09-29T18:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.977306 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:50 crc kubenswrapper[4780]: I0929 18:43:50.994725 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.005853 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.046644 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"
mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\
",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 
18:43:51.074230 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.074284 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.074295 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.074317 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.074330 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.080502 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.081166 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.129567 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.143370 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.154026 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.166851 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.176577 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.176624 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.176633 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.176649 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.176661 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.181207 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.195333 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.206170 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.226975 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\
\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\
\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 
18:43:51.243349 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.253140 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.270635 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.279286 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.279318 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.279327 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.279346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.279358 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.302464 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.320571 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.333277 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.351925 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.376974 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.381874 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.381929 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.381964 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.381986 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.382005 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.484921 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.484975 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.484995 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.485015 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.485028 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.587910 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.587978 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.587990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.588016 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.588030 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.690519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.690569 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.690580 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.690598 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.690608 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.752473 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:51 crc kubenswrapper[4780]: E0929 18:43:51.752677 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.793988 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.794039 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.794062 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.794081 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.794092 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.897088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.897139 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.897156 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.897181 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:51 crc kubenswrapper[4780]: I0929 18:43:51.897197 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:51Z","lastTransitionTime":"2025-09-29T18:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.000220 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.000272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.000283 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.000301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.000314 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.083391 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.103997 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.104536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.104551 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.104571 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.104586 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.206869 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.206918 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.206932 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.206951 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.206965 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.333350 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.333394 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.333402 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.333418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.333430 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.435917 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.435968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.435980 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.435999 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.436011 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.543183 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.543242 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.543257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.543281 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.543295 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.646345 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.646387 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.646396 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.646410 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.646421 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.749607 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.749681 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.749704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.749735 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.749755 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.753115 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.753204 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:52 crc kubenswrapper[4780]: E0929 18:43:52.753358 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:52 crc kubenswrapper[4780]: E0929 18:43:52.753650 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.852450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.852492 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.852503 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.852524 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.852540 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.956426 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.956514 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.956529 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.956558 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:52 crc kubenswrapper[4780]: I0929 18:43:52.956573 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:52Z","lastTransitionTime":"2025-09-29T18:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.060340 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.060403 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.060423 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.060451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.060471 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.090137 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/0.log" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.093174 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5" exitCode=1 Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.093240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.094536 4780 scope.go:117] "RemoveContainer" containerID="55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.115630 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.129329 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.147482 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.164409 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.164474 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.164499 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.164531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.164547 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.165591 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.189229 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105
f7d3b0690efddcf3af963eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:52Z\\\",\\\"message\\\":\\\" 6080 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 18:43:52.274083 6080 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274279 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274566 6080 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274622 6080 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274824 6080 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274938 6080 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275191 6080 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275498 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.208317 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.222733 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.245932 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.267585 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.267628 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc 
kubenswrapper[4780]: I0929 18:43:53.267637 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.267683 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.267694 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.279839 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.301240 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.321211 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.336400 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.355571 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.374654 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.374716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.374735 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.374762 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.374781 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.377897 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.386436 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.390171 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:53Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.479614 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.479671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.479682 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.479702 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.479716 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.582163 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.582200 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.582208 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.582226 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.582237 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.684783 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.684833 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.684844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.684864 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.684897 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.753110 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:53 crc kubenswrapper[4780]: E0929 18:43:53.753297 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.788440 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.788489 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.788501 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.788519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.788535 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.891179 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.891215 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.891226 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.891247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.891260 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.994384 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.994437 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.994451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.994475 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:53 crc kubenswrapper[4780]: I0929 18:43:53.994488 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:53Z","lastTransitionTime":"2025-09-29T18:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.096653 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.096690 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.096703 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.096722 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.096736 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.100561 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/0.log" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.103564 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.104120 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.121662 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.137712 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.154386 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.169453 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.188063 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.199876 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.199939 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.199953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.199977 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.199992 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.204787 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.224643 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d1
0ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:52Z\\\",\\\"message\\\":\\\" 6080 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 18:43:52.274083 6080 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274279 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274566 6080 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274622 6080 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274824 6080 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274938 6080 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275191 6080 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275498 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.238534 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.251033 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.266456 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.293442 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.302643 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.302892 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.303078 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.303217 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.303363 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.309130 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.325717 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.342084 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.357770 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:54Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.405556 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.405591 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc 
kubenswrapper[4780]: I0929 18:43:54.405602 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.405618 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.405630 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.508901 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.508986 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.509009 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.509042 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.509113 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.612809 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.613328 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.613587 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.613806 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.613991 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.717588 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.717657 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.717674 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.717702 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.717721 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.752286 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.752405 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:54 crc kubenswrapper[4780]: E0929 18:43:54.752525 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:54 crc kubenswrapper[4780]: E0929 18:43:54.752656 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.821460 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.821528 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.821546 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.821573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.821593 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.926686 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.927181 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.927324 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.927480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:54 crc kubenswrapper[4780]: I0929 18:43:54.927695 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:54Z","lastTransitionTime":"2025-09-29T18:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.032240 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.032301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.032320 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.032345 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.032363 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.116688 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/1.log" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.118778 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/0.log" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.123159 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc" exitCode=1 Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.123228 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.123310 4780 scope.go:117] "RemoveContainer" containerID="55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.124573 4780 scope.go:117] "RemoveContainer" containerID="9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc" Sep 29 18:43:55 crc kubenswrapper[4780]: E0929 18:43:55.124929 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.136840 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.136945 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.137011 4780 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.137097 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.137124 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.155516 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d
0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Runnin
g\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.179732 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.195589 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.211165 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.222773 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.238530 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.241713 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.241744 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.241755 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.241922 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.241945 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.255013 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.264556 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.278023 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.294096 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.316553 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.331903 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.345690 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.345742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.345753 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.345774 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.345788 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.361080 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:52Z\\\",\\\"message\\\":\\\" 6080 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 18:43:52.274083 6080 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274279 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274566 6080 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274622 6080 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274824 6080 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274938 6080 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275191 6080 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275498 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.377673 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.396967 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.449140 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.449207 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.449225 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.449247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.449268 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.553088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.553179 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.553204 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.553242 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.553269 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.658190 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.658484 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.658508 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.658541 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.658562 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.745467 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz"] Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.746408 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.750481 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.750870 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.752468 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:55 crc kubenswrapper[4780]: E0929 18:43:55.752764 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.762634 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.762676 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.762697 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.762721 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.762741 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.768661 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.768751 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.768844 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rwr6\" (UniqueName: \"kubernetes.io/projected/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-kube-api-access-8rwr6\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.768955 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.778181 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.800553 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.822782 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.852201 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.866430 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.866513 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc 
kubenswrapper[4780]: I0929 18:43:55.866533 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.866560 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.866582 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.870194 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.870293 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.870356 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.870411 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rwr6\" (UniqueName: \"kubernetes.io/projected/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-kube-api-access-8rwr6\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.871646 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.871986 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.881226 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.893346 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e
779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.902852 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rwr6\" (UniqueName: \"kubernetes.io/projected/cbcc9dd3-6eaf-4833-92f1-d126a87bbd49-kube-api-access-8rwr6\") pod \"ovnkube-control-plane-749d76644c-5smhz\" (UID: \"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.925636 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.949708 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.969446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.969503 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.969514 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.969537 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.969553 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:55Z","lastTransitionTime":"2025-09-29T18:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.971314 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:55 crc kubenswrapper[4780]: I0929 18:43:55.990986 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:55Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.023640 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55d464603a70641b29897b94f0df44489cdca105f7d3b0690efddcf3af963eb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:52Z\\\",\\\"message\\\":\\\" 6080 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0929 18:43:52.274083 6080 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274279 6080 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274566 6080 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274622 6080 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274824 6080 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0929 18:43:52.274938 6080 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275191 6080 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0929 18:43:52.275498 6080 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b
0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.046700 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.067692 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.073526 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.073620 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.073646 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.073680 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.073704 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.074264 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.090652 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.117760 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.130630 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/1.log" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.143273 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.143507 4780 scope.go:117] "RemoveContainer" containerID="9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.145946 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.149525 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" event={"ID":"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49","Type":"ContainerStarted","Data":"2d5d105efbcdc8b691a6ff03d2bae712778cf684797841e9b7728e8fda5e1a8c"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.163316 4780 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.179338 4780 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.179399 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.179418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.179440 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.179454 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.184024 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.198922 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.215561 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.229793 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.247131 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.264800 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.282495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.282547 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.282563 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.282589 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.282605 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.294295 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d1
0ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.311522 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.322708 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.334592 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.348385 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.372086 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.375258 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.375349 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.375399 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375429 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:44:12.375405361 +0000 UTC m=+52.323703405 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.375473 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375486 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375492 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.375507 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375516 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375555 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375574 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375600 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375610 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375527 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-29 18:44:12.375517854 +0000 UTC m=+52.323815908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375659 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:12.375648527 +0000 UTC m=+52.323946571 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375674 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:12.375667138 +0000 UTC m=+52.323965182 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375704 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.375826 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:12.375787001 +0000 UTC m=+52.324085055 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385627 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385660 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385688 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385699 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.385782 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.399852 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.416511 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.435532 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.488496 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.488543 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.488555 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.488573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.488586 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.592215 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.592262 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.592280 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.592301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.592312 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.695914 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.695985 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.696006 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.696035 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.696094 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.752701 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.752816 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.752963 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.753150 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.799429 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.799505 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.799526 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.799558 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.799588 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.902304 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.902366 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.902375 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.902392 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.902406 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:56Z","lastTransitionTime":"2025-09-29T18:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.934828 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j6vxr"] Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.935744 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:56 crc kubenswrapper[4780]: E0929 18:43:56.935874 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.955713 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.978642 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.982430 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcbf5\" (UniqueName: \"kubernetes.io/projected/f7b75391-2034-4284-b779-eb7b1e9da774-kube-api-access-tcbf5\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 
18:43:56.982608 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:56 crc kubenswrapper[4780]: I0929 18:43:56.996106 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:56Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.006128 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.006204 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.006227 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.006257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.006281 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.017650 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.070594 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0
eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.083890 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbf5\" (UniqueName: \"kubernetes.io/projected/f7b75391-2034-4284-b779-eb7b1e9da774-kube-api-access-tcbf5\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.083964 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.084160 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.084225 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:57.584207631 +0000 UTC m=+37.532505685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110279 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"
,\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110729 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110758 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110775 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110800 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.110816 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.121457 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbf5\" (UniqueName: \"kubernetes.io/projected/f7b75391-2034-4284-b779-eb7b1e9da774-kube-api-access-tcbf5\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.131663 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150449 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150493 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150623 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150649 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.150660 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.154429 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" event={"ID":"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49","Type":"ContainerStarted","Data":"e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.154481 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" event={"ID":"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49","Type":"ContainerStarted","Data":"8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.164199 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.164275 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.167999 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.168061 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.168075 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.168094 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.168107 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.180408 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.184213 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.184258 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.184268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.184290 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.184300 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.188017 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d1
0ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.198935 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.199562 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.203173 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.203221 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.203233 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.203259 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.203273 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.214537 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.220808 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7
e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.224515 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.224557 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.224570 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.224592 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.224608 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.229863 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-c
ni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.240136 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.242458 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.242569 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.244483 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.244510 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.244525 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.244547 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.244561 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.253781 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.264713 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.276793 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.291892 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.306651 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.324522 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.337757 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.347113 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.347145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.347153 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.347170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.347179 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
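The NodeNotReady transition recorded just above is a separate, downstream symptom: the container runtime reports NetworkReady=false because nothing has yet written a CNI configuration file to /etc/kubernetes/cni/net.d/, and the ovnkube-node-p7vtr entry that follows shows the ovnkube-controller container exiting with an error and entering CrashLoopBackOff before it can do so. A minimal sketch of that check (illustrative only, not part of the log; the path comes straight from the kubelet message, and running it on the node itself is an assumption):

    // cnicheck.go -- minimal sketch: list whatever CNI configuration the
    // kubelet would find in the directory named in the NodeNotReady
    // message above. Assumes it runs on the node ("crc") itself.
    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        matches, err := filepath.Glob("/etc/kubernetes/cni/net.d/*")
        if err != nil {
            fmt.Println("glob failed:", err)
            return
        }
        if len(matches) == 0 {
            // This is the state the kubelet is reporting: the network
            // plugin (ovnkube-controller) has not written its config yet.
            fmt.Println("no CNI configuration file present")
            return
        }
        for _, m := range matches {
            fmt.Println(m)
        }
    }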
Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.359494 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.376582 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.390408 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.404228 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 
Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.415920 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.436423 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.450120 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.450214 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.450233 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.450262 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.450280 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.453653 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.472222 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.492951 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.522397 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.547939 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.553279 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.553328 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.553345 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.553370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.553389 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.573432 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.589290 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.589610 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.589726 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs 
podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:43:58.589702 +0000 UTC m=+38.538000054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.591705 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:57Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.657179 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.657303 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 
18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.657320 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.657343 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.657358 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.752435 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:57 crc kubenswrapper[4780]: E0929 18:43:57.752697 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.761670 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.761739 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.761759 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.761786 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.761806 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.864884 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.864937 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.864958 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.864984 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.865004 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.968008 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.968146 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.968174 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.968214 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:57 crc kubenswrapper[4780]: I0929 18:43:57.968242 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:57Z","lastTransitionTime":"2025-09-29T18:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.071916 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.071965 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.071980 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.072005 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.072022 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.176098 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.176159 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.176177 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.176200 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.176217 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.280265 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.280315 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.280332 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.280357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.280376 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.383792 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.383866 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.383889 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.383919 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.383940 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.487137 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.487220 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.487239 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.487267 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.487284 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.590304 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.590359 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.590378 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.590410 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.590436 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.601365 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:58 crc kubenswrapper[4780]: E0929 18:43:58.601571 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:58 crc kubenswrapper[4780]: E0929 18:43:58.601694 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:00.601657408 +0000 UTC m=+40.549955492 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.622520 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.644312 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.659664 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.674424 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.693592 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.693669 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.693688 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.693718 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.693736 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.695331 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.726536 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d1
0ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.743157 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.752806 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.752904 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:43:58 crc kubenswrapper[4780]: E0929 18:43:58.753091 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:43:58 crc kubenswrapper[4780]: E0929 18:43:58.753275 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.753502 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:43:58 crc kubenswrapper[4780]: E0929 18:43:58.753658 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.762811 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.780901 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 
18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.795112 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.797256 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.797302 4780 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.797320 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.797350 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.797374 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.819552 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03
a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.846973 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.866759 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.887702 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.901259 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.901306 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.901319 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.901339 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.901352 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:58Z","lastTransitionTime":"2025-09-29T18:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.910464 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.929983 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.946773 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:58 crc kubenswrapper[4780]: I0929 18:43:58.960937 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:43:58Z is after 2025-08-24T17:21:41Z" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.004159 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.004201 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.004213 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.004233 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.004244 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.107702 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.107739 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.107750 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.107768 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.107779 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.211444 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.211508 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.211525 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.211555 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.211573 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.315289 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.315388 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.315411 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.315450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.315478 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.420616 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.420681 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.420691 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.420714 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.420728 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.524412 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.524501 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.524523 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.524568 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.524592 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.628460 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.628542 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.628562 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.628588 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.628606 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.732677 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.732724 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.732732 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.732750 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.732760 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.752022 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:43:59 crc kubenswrapper[4780]: E0929 18:43:59.752231 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.837247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.837322 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.837342 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.837369 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.837386 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.940384 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.940444 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.940456 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.940484 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:43:59 crc kubenswrapper[4780]: I0929 18:43:59.940498 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:43:59Z","lastTransitionTime":"2025-09-29T18:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.045471 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.045575 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.045603 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.045644 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.045679 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.150034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.150137 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.150155 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.150183 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.150205 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.253826 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.253885 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.253920 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.253951 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.253971 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.358285 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.358354 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.358374 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.358404 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.359024 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.463618 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.463681 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.463698 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.463726 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.463745 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.566557 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.566929 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.566940 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.566960 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.566972 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.626550 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:00 crc kubenswrapper[4780]: E0929 18:44:00.626737 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:44:00 crc kubenswrapper[4780]: E0929 18:44:00.626814 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:04.626796549 +0000 UTC m=+44.575094593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.670186 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.670237 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.670248 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.670268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.670281 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.752736 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.752781 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.752977 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:00 crc kubenswrapper[4780]: E0929 18:44:00.752966 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:00 crc kubenswrapper[4780]: E0929 18:44:00.753232 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:00 crc kubenswrapper[4780]: E0929 18:44:00.753450 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.773642 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.774162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.774233 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.774258 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.774289 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.774317 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.788273 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.799942 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.815987 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.835789 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.848695 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.865998 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"
quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.877142 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.877403 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.877519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.877647 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.877791 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.883147 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.898291 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.912601 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.942450 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\
\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.958102 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.971896 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.980308 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.980344 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.980358 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.980399 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:00 crc kubenswrapper[4780]: I0929 18:44:00.980428 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:00Z","lastTransitionTime":"2025-09-29T18:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.000072 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:00Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.015777 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:01Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.037926 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:01Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.049931 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:01Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.083574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.083609 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.083620 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.083669 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.083684 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.186291 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.186357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.186380 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.186407 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.186426 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.289182 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.289224 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.289235 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.289254 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.289268 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.392584 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.392633 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.392644 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.392661 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.392671 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.495455 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.495497 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.495508 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.495525 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.495536 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.599091 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.599160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.599212 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.599238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.599293 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.703712 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.703784 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.703803 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.703835 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.703859 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.753014 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:01 crc kubenswrapper[4780]: E0929 18:44:01.753405 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.808103 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.808174 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.808192 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.808220 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.808239 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.912104 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.912221 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.912240 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.912265 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:01 crc kubenswrapper[4780]: I0929 18:44:01.912282 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:01Z","lastTransitionTime":"2025-09-29T18:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.015742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.015813 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.015831 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.015859 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.015878 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.119203 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.119583 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.119789 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.120008 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.120290 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.224781 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.225238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.225454 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.225675 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.225855 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.328785 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.328857 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.328874 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.328897 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.328920 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.431578 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.431645 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.431663 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.431689 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.431708 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.535390 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.535446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.535467 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.535495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.535520 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.638796 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.638855 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.638870 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.638892 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.638907 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.741934 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.742018 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.742075 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.742114 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.742138 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.752202 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.752251 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.752395 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:02 crc kubenswrapper[4780]: E0929 18:44:02.752590 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:02 crc kubenswrapper[4780]: E0929 18:44:02.752965 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:02 crc kubenswrapper[4780]: E0929 18:44:02.753159 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.845900 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.845990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.846010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.846119 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.846143 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.950629 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.950668 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.950680 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.950697 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:02 crc kubenswrapper[4780]: I0929 18:44:02.950712 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:02Z","lastTransitionTime":"2025-09-29T18:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.054467 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.054529 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.054541 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.054559 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.054569 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.157880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.157969 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.157988 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.158011 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.158063 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.261880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.261924 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.261962 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.261981 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.261995 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.366013 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.366106 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.366118 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.366144 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.366159 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.469085 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.469196 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.469217 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.469275 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.469295 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.573476 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.573539 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.573552 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.573573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.573639 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.677088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.677162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.677247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.677279 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.677300 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.753030 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:03 crc kubenswrapper[4780]: E0929 18:44:03.753230 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.780336 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.780397 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.780415 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.780445 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.780465 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.883429 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.883515 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.883541 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.883576 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.883600 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.987574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.987635 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.987652 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.987677 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:03 crc kubenswrapper[4780]: I0929 18:44:03.987695 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:03Z","lastTransitionTime":"2025-09-29T18:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.091420 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.091506 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.091525 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.091550 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.091567 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.194869 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.194955 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.194971 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.194990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.195001 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.298131 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.298177 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.298186 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.298203 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.298214 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.401712 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.401762 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.401773 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.401792 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.401804 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.504777 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.505172 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.505292 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.505424 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.505650 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.608733 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.609121 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.609201 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.609301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.609385 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.673660 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:04 crc kubenswrapper[4780]: E0929 18:44:04.673957 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 29 18:44:04 crc kubenswrapper[4780]: E0929 18:44:04.674145 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:12.674111965 +0000 UTC m=+52.622410219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.713790 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.713846 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.713858 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.713880 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.713897 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.752541 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.752574 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:04 crc kubenswrapper[4780]: E0929 18:44:04.752813 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.752873 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:04 crc kubenswrapper[4780]: E0929 18:44:04.753287 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:04 crc kubenswrapper[4780]: E0929 18:44:04.753376 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.817601 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.817650 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.817662 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.817678 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.817688 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.922481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.922524 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.922535 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.922552 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:04 crc kubenswrapper[4780]: I0929 18:44:04.922564 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:04Z","lastTransitionTime":"2025-09-29T18:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.025282 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.025340 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.025357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.025382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.025397 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.128672 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.128729 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.128742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.128763 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.128777 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.231947 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.232002 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.232012 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.232031 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.232042 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.335209 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.335259 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.335271 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.335287 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.335297 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.437720 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.437767 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.437781 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.437799 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.437811 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.540609 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.540647 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.540656 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.540673 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.540683 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.644258 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.644299 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.644309 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.644324 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.644334 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.747774 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.747836 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.747851 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.747872 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.747884 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.753070 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:05 crc kubenswrapper[4780]: E0929 18:44:05.753232 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
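[editor's note] The condition object that setters.go:603 logs on every one of these heartbeats is plain JSON, so it can be parsed straight out of the journal when filtering this noise. A small self-contained sketch, with struct fields named after the keys visible in the log line itself:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Mirrors the condition payload logged by setters.go:603 above.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Verbatim payload from the entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s)\n", c.Type, c.Status, c.Reason) // Ready=False (KubeletNotReady)
}
```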
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.850957 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.851017 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.851031 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.851077 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.851108 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.953757 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.953800 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.953810 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.953823 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:05 crc kubenswrapper[4780]: I0929 18:44:05.953834 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:05Z","lastTransitionTime":"2025-09-29T18:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.056453 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.056509 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.056522 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.056541 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.056555 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.160125 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.160198 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.160225 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.160260 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.160284 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.264396 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.264440 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.264452 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.264471 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.264481 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.366932 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.366994 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.367003 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.367021 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.367034 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.469854 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.469965 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.469985 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.470016 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.470037 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.573656 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.573720 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.573735 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.573757 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.573771 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.676652 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.676725 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.676742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.676763 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.676781 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.753213 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.753310 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:06 crc kubenswrapper[4780]: E0929 18:44:06.753425 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.753657 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:06 crc kubenswrapper[4780]: E0929 18:44:06.753765 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:06 crc kubenswrapper[4780]: E0929 18:44:06.754176 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
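[editor's note] Every "No sandbox for pod can be found" / "Error syncing pod" pair above traces back to the same root cause: /etc/kubernetes/cni/net.d/ is empty. On OpenShift that file is written by the network operator (OVN-Kubernetes) once it comes up, so it should not be hand-written; purely as a shape reference, the sketch below checks the directory the kubelet complains about and prints what a minimal conflist looks like. The network name and plugin choice are illustrative assumptions only.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Shape reference only — on OpenShift the network operator owns this file.
const exampleConflist = `{
  "cniVersion": "0.4.0",
  "name": "example-net",
  "plugins": [
    { "type": "bridge", "bridge": "cni0",
      "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" } }
  ]
}`

func main() {
	// The exact directory named in the kubelet error above.
	conflists, _ := filepath.Glob("/etc/kubernetes/cni/net.d/*.conflist")
	confs, _ := filepath.Glob("/etc/kubernetes/cni/net.d/*.conf")
	if len(conflists)+len(confs) == 0 {
		fmt.Fprintln(os.Stderr, "no CNI config present — matches the kubelet error above")
		fmt.Println("a minimal conflist, for shape only:\n" + exampleConflist)
	}
}
```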
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.780078 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.780138 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.780156 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.780181 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.780202 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.884210 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.884342 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.884363 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.884402 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.884422 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.987556 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.987611 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.987626 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.987648 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:06 crc kubenswrapper[4780]: I0929 18:44:06.987661 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:06Z","lastTransitionTime":"2025-09-29T18:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.092144 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.092250 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.092272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.092297 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.092314 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.194714 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.194786 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.194808 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.194841 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.194873 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.258709 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.258804 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.258823 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.258857 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.258881 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.278109 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:07Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.284211 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.284278 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
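[editor's note] The patch failure above is not a kubelet fault: the node-identity webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-09-29, so Go's TLS verification rejects it. A minimal sketch reproducing the validity-window check behind that x509 error; the PEM path is a placeholder assumption, to be pointed at whatever certificate the webhook actually serves.

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// Reproduces the check behind "x509: certificate has expired or is not
// yet valid" in the log above. The path is a placeholder assumption.
func main() {
	data, err := os.ReadFile("/path/to/webhook-serving-cert.pem")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Println("not yet valid")
	default:
		fmt.Println("within validity window")
	}
}
```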
event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.284291 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.284314 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.284329 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.304841 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:07Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.309833 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.309976 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.310091 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.310187 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.310270 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.330443 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:07Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.336010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.336088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.336107 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.336155 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.336175 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.352916 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:07Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.359518 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.359808 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.359956 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.360210 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.360395 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.383420 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:07Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.383561 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.385484 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
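Every one of the patch attempts above dies the same way: the API server must consult the validating webhook node.network-node-identity.openshift.io before admitting the node-status PATCH, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, roughly five weeks before these entries; after a fixed number of attempts (five in the upstream kubelet) the loop gives up with "update node status exceeds retry count". A minimal sketch for reproducing the verification failure from the node, assuming the endpoint from the Post URL above is reachable; the CA-bundle path is a placeholder, and which check OpenSSL fails first depends on the trust store loaded:

# Minimal sketch: handshake with the webhook endpoint and report why
# certificate verification fails, mirroring the kubelet error above.
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743

ctx = ssl.create_default_context()
# ctx.load_verify_locations("/path/to/cluster-ca.pem")  # placeholder path

try:
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST):
            print("TLS handshake succeeded; serving certificate verifies")
except ssl.SSLCertVerificationError as err:
    # With an expired serving cert this can report, e.g.:
    #   verification failed: certificate has expired (code 10)
    # where 10 is OpenSSL's X509_V_ERR_CERT_HAS_EXPIRED.
    print(f"verification failed: {err.verify_message} (code {err.verify_code})")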
event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.385526 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.385543 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.385565 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.385581 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.489000 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.489091 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.489102 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.489120 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.489137 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.592911 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.592983 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.593001 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.593028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.593079 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.696310 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.696346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.696357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.696372 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.696383 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.752671 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:07 crc kubenswrapper[4780]: E0929 18:44:07.753923 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.755234 4780 scope.go:117] "RemoveContainer" containerID="9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.800282 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.800746 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.800964 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.801227 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.801435 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.905552 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.905883 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.906132 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.906257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:07 crc kubenswrapper[4780]: I0929 18:44:07.906536 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:07Z","lastTransitionTime":"2025-09-29T18:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.011098 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.011144 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.011158 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.011176 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.011190 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.114310 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.114357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.114369 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.114392 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.114405 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.198879 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/1.log"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.202724 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606"}
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.203490 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.217208 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.217247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.217259 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.217277 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.217288 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
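The ContainerStarted event for ovnkube-node-p7vtr above is the first sign of the network plugin coming back up; the NotReady condition should clear once it writes a CNI config into the /etc/kubernetes/cni/net.d/ directory that every message above reports as missing. A sketch for watching that directory on the node:

# Sketch: poll for the CNI config whose absence the kubelet keeps
# reporting above (run on the node; path taken from the log message).
import time
from pathlib import Path

cni_dir = Path("/etc/kubernetes/cni/net.d")
while not (cni_dir.is_dir() and any(cni_dir.iterdir())):
    print(f"{cni_dir} is still empty; network plugin not ready")
    time.sleep(2)
print("CNI config present:", sorted(p.name for p in cni_dir.iterdir()))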
Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.223507 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z"
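The pod-status updates that follow fail through the same expired certificate, just via the pod.network-node-identity.openshift.io webhook instead of the node one. The patch body inside err= is double-escaped: the patch bytes were quoted once into the error message and the whole message was quoted again by the logger, which is why every quote shows up as \\\". A sketch for recovering a patch from a raw entry; the file name is again a placeholder:

# Sketch: recover the escaped status patch from a "Failed to update
# status for pod" entry. Two decode rounds undo the two quoting layers.
import json

def extract_patch(line: str) -> dict:
    # The escaped patch sits between 'status \"' and '\" for pod'.
    escaped = line.split('status \\"', 1)[1].split('\\" for pod', 1)[0]
    for _ in range(2):  # logger quoting, then error-message quoting
        escaped = escaped.encode().decode("unicode_escape")
    return json.loads(escaped)

with open("kubelet-journal.log") as fh:  # placeholder file name
    for line in fh:
        if "Failed to update status for pod" in line and '\\" for pod' in line:
            patch = extract_patch(line)
            # e.g. ['$setElementOrder/conditions', 'conditions', 'containerStatuses']
            print(list(patch.get("status", {})))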
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.264886 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.285733 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.297191 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.310239 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 
18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.320096 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.320151 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.320167 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.320190 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.320209 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.325347 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.338197 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.357393 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.371774 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.385683 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.400491 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.414217 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.423089 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.423145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.423158 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.423180 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.423215 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.429997 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.441755 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.454227 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.467228 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:08Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.525946 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.526001 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.526010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.526028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.526054 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.629121 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.629187 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.629209 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.629238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.629256 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.732588 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.732657 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.732676 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.732709 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.732732 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.753006 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.753016 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:08 crc kubenswrapper[4780]: E0929 18:44:08.753238 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.753023 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:08 crc kubenswrapper[4780]: E0929 18:44:08.753391 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:08 crc kubenswrapper[4780]: E0929 18:44:08.753549 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.836955 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.837040 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.837102 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.837141 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.837186 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.941349 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.941451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.941476 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.941511 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:08 crc kubenswrapper[4780]: I0929 18:44:08.941532 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:08Z","lastTransitionTime":"2025-09-29T18:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.045257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.045329 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.045347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.045377 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.045403 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.148533 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.148581 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.148592 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.148613 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.148626 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.209381 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/2.log" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.210254 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/1.log" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.213853 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606" exitCode=1 Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.213929 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.213999 4780 scope.go:117] "RemoveContainer" containerID="9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.215559 4780 scope.go:117] "RemoveContainer" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606" Sep 29 18:44:09 crc kubenswrapper[4780]: E0929 18:44:09.215949 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.235455 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.251450 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.251875 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.251946 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.251964 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.251990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.252010 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.264600 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.279422 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.295011 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.311170 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.326623 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.342534 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.355891 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.355935 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc 
kubenswrapper[4780]: I0929 18:44:09.355948 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.355968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.355991 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.366607 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.381170 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.395242 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.406374 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.424161 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.451413 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d7f304bfe5d48173b8fadb051ac6c32029943d10ef18e48bdcf0f4087b492dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:43:54Z\\\",\\\"message\\\":\\\".ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI0929 18:43:54.046886 6220 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:43:54.046484 6220 services_controller.go:443] Built service openshift-service-ca-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.40\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0929 18:43:54.046950 6220 services_controller.go:444] Built service openshift-service-ca-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0929 18:43:54.046961 6220 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" 
UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace
61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.458789 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.458865 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.458881 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.458903 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 
18:44:09.458922 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.469472 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.491558 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.512520 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:09Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.562341 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.562381 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.562391 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.562410 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.562423 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.666918 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.667002 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.667028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.667099 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.667126 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.752313 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:09 crc kubenswrapper[4780]: E0929 18:44:09.752500 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.770179 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.770241 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.770252 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.770268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.770279 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.873831 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.873920 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.873945 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.873977 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.874000 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.977858 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.977999 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.978019 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.978068 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:09 crc kubenswrapper[4780]: I0929 18:44:09.978088 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:09Z","lastTransitionTime":"2025-09-29T18:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.081094 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.081136 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.081148 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.081167 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.081182 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.184438 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.184511 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.184531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.184555 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.184575 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.220343 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/2.log" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.224790 4780 scope.go:117] "RemoveContainer" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606" Sep 29 18:44:10 crc kubenswrapper[4780]: E0929 18:44:10.224993 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.240341 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.266142 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.287861 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.287951 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.287968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.287990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.288007 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.295695 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.312456 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.330328 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.350657 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.373555 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.390508 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.390562 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.390582 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.390605 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.390622 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.394995 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.410995 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.426890 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.449526 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.468335 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.493456 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.493544 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.493562 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.493588 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.493606 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.506150 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.522464 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.534227 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.546680 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.561400 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.596117 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.596763 4780 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.596818 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.596848 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.596864 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.699274 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.699319 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.699328 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.699345 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.699357 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.752788 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.752789 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:10 crc kubenswrapper[4780]: E0929 18:44:10.752946 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:10 crc kubenswrapper[4780]: E0929 18:44:10.753007 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.752815 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:10 crc kubenswrapper[4780]: E0929 18:44:10.753121 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.764280 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.774844 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.785648 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.803516 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.803582 4780 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.803605 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.803636 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.803660 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.804801 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.825147 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.840656 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.854099 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.879450 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.897017 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.906245 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.906643 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.906875 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.906993 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.907099 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:10Z","lastTransitionTime":"2025-09-29T18:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.915552 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.933436 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.947339 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.968971 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:10 crc kubenswrapper[4780]: I0929 18:44:10.986690 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.001588 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:10Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.009674 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.009994 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.010116 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.010190 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.010256 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.034573 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839
eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:11Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.051010 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:11Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.113557 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.113992 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.114005 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.114092 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.114113 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.217317 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.217398 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.217418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.217451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.217473 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
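Every status patch in this stretch is rejected for the same reason: the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) is more than a month behind the node clock (2025-09-29T18:44:11Z). The kubelet entries already carry both timestamps, so the skew can be confirmed straight from the log text. A minimal stdlib-only Python sketch for reader-side triage, with the error string copied from the entry above (this is not kubelet code):

```python
import re
from datetime import datetime, timezone

# x509 error text as it appears in the kubelet entry above.
err = ('tls: failed to verify certificate: x509: certificate has expired '
       'or is not yet valid: current time 2025-09-29T18:44:11Z is after '
       '2025-08-24T17:21:41Z')

# The message carries two RFC 3339 timestamps: current time, then notAfter.
now_s, not_after_s = re.findall(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z', err)
fmt = '%Y-%m-%dT%H:%M:%SZ'
now = datetime.strptime(now_s, fmt).replace(tzinfo=timezone.utc)
not_after = datetime.strptime(not_after_s, fmt).replace(tzinfo=timezone.utc)

# A positive delta means the webhook cert expired that long before this entry.
print(f'webhook certificate expired {now - not_after} before this log entry')
```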
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.320591 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.320668 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.320686 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.320716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.320736 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.424065 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.424125 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.424142 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.424162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.424174 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.527908 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.527962 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.527979 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.528006 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.528024 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.632570 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.632719 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.632738 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.632767 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.632787 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.736332 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.736417 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.736447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.736480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.736506 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.752766 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:11 crc kubenswrapper[4780]: E0929 18:44:11.753218 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
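The NotReady heartbeats and the "Error syncing pod" records above all trace back to the message the runtime keeps repeating: /etc/kubernetes/cni/net.d/ holds no CNI configuration, because the ovnkube-controller container that would populate it is crash-looping. The readiness test the runtime applies is effectively "does the conf dir contain a network config"; below is a hedged sketch of an equivalent check, assuming the conventional .conf/.conflist/.json extensions accepted by CNI-style runtimes, not the actual ocicni source:

```python
import glob
import os

# Directory named in the kubelet/runtime message above.
CNI_CONF_DIR = '/etc/kubernetes/cni/net.d'

def cni_configs(conf_dir: str = CNI_CONF_DIR) -> list[str]:
    """Return the CNI network configs present, as a CNI runtime would see them."""
    patterns = ('*.conf', '*.conflist', '*.json')
    return sorted(f for p in patterns
                  for f in glob.glob(os.path.join(conf_dir, p)))

configs = cni_configs()
if configs:
    print('network config present:', configs)
else:
    print('no CNI configuration file found; the node will stay NotReady')
```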
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.840531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.840600 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.840617 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.840645 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.840665 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.944428 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.944511 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.944534 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.944750 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:11 crc kubenswrapper[4780]: I0929 18:44:11.944773 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:11Z","lastTransitionTime":"2025-09-29T18:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.048582 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.048638 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.048652 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.048671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.048685 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.153099 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.153199 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.153224 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.153262 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.153288 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.257147 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.257202 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.257215 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.257236 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.257249 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.361107 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.361176 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.361187 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.361211 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.361223 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.465355 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.465450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.465464 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.465487 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.465537 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.469393 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.469562 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469600 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:44:44.469561445 +0000 UTC m=+84.417859499 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
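The "No retries permitted until ..." records show the volume manager's per-operation exponential backoff: each consecutive failure doubles the wait before the next attempt, which is why the CSI unmount above is pushed out 32s while the metrics-certs mount further down is still at 16s. A small sketch of that doubling policy follows; the 500 ms initial delay and the cap are illustrative assumptions, not values read from this log:

```python
def retry_delay(failures: int, base: float = 0.5, cap: float = 128.0) -> float:
    """Seconds to wait before the next retry after `failures` consecutive failures."""
    return min(base * (2 ** failures), cap)

# Doubling walks 0.5, 1, 2, 4, 8, 16, 32, ... up to the cap.
for n in range(8):
    print(n, retry_delay(n))
```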
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.469632 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.469694 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469701 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.469750 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469808 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:44.469778621 +0000 UTC m=+84.418076745 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469878 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469908 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469926 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469927 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469956 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469977 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:44.469964226 +0000 UTC m=+84.418262360 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.469979 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.470027 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:44.470017968 +0000 UTC m=+84.418316022 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.470086 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.470160 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:44.470138841 +0000 UTC m=+84.418436925 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.569836 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.569922 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.569943 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.569970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.569986 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.673848 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.673924 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.673942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.673971 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.673992 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.752897 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.752969 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.753098 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.753114 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.753323 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.753508 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
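By this point the journal is dominated by a few record shapes repeating every 100 ms or so: node-status events, "No sandbox for pod" restarts, and per-pod sync failures. When triaging an excerpt like this one, tallying records first makes the one-line root cause stand out from the spam. A sketch, where kubelet.log is a hypothetical saved copy of this journal:

```python
import re
from collections import Counter

counts = Counter()
with open('kubelet.log') as fh:            # hypothetical local copy of this journal
    for line in fh:
        m = re.search(r'event="(\w+)"', line)
        if m:                              # node lifecycle events
            counts[m.group(1)] += 1
        elif 'Error syncing pod' in line:  # per-pod sync failures
            counts['ErrorSyncingPod'] += 1

for record, n in counts.most_common():
    print(f'{record:25s} {n}')
```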
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.773930 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.774181 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 29 18:44:12 crc kubenswrapper[4780]: E0929 18:44:12.774282 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:44:28.774259787 +0000 UTC m=+68.722557831 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.776645 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.776716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.776734 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.776763 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:12 crc kubenswrapper[4780]: I0929 18:44:12.776783 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:12Z","lastTransitionTime":"2025-09-29T18:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Sep 29 18:44:13 crc kubenswrapper[4780]: I0929 18:44:13.752696 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:13 crc kubenswrapper[4780]: E0929 18:44:13.753020 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[The same event group and "Node became not ready" condition repeat at 18:44:13.818, 18:44:13.921, 18:44:14.025, 18:44:14.128, 18:44:14.231 and 18:44:14.335.]
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.358406 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.376400 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.383281 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.398796 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.415324 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.433594 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.438876 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.439273 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.439458 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.439623 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.439727 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.459071 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.480505 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.500305 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.516232 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.530642 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.542251 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.542299 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.542309 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.542331 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.542346 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.552199 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.568401 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.585690 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.604905 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.623597 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.638191 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.644849 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.644915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.644925 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.644945 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:14 crc kubenswrapper[4780]: 
I0929 18:44:14.644955 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.653241 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.668390 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:14Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.747840 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.747908 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.747931 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.747957 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.747973 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.752281 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:14 crc kubenswrapper[4780]: E0929 18:44:14.752420 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.752481 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.752593 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:14 crc kubenswrapper[4780]: E0929 18:44:14.752854 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:14 crc kubenswrapper[4780]: E0929 18:44:14.753017 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.851011 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.851109 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.851127 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.851153 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.851172 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.954850 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.954915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.954932 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.954961 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:14 crc kubenswrapper[4780]: I0929 18:44:14.954978 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:14Z","lastTransitionTime":"2025-09-29T18:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.059495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.059556 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.059573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.059597 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.059617 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.163246 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.163314 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.163333 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.163357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.163382 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.268118 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.268218 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.268238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.268270 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.268289 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.372633 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.372716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.372736 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.372760 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.372777 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.477134 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.477208 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.477230 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.477258 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.477281 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.580843 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.580911 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.580982 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.581010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.581030 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.683723 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.683795 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.683814 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.683838 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.683856 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.752377 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:15 crc kubenswrapper[4780]: E0929 18:44:15.752573 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.786687 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.786746 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.786758 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.786784 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.786797 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.890641 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.890718 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.890735 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.890765 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.890782 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.994289 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.994329 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.994342 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.994361 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:15 crc kubenswrapper[4780]: I0929 18:44:15.994373 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:15Z","lastTransitionTime":"2025-09-29T18:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.098235 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.098311 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.098335 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.098366 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.098392 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.201277 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.201328 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.201347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.201372 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.201401 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.305435 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.305509 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.305527 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.305562 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.305584 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.408200 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.408533 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.408703 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.408923 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.409112 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.513265 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.513371 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.513397 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.513431 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.513455 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.616538 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.616889 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.617030 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.617355 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.617509 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.720888 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.720953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.720978 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.721010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.721029 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.752567 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.752628 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.752719 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:16 crc kubenswrapper[4780]: E0929 18:44:16.752784 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:16 crc kubenswrapper[4780]: E0929 18:44:16.752875 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:16 crc kubenswrapper[4780]: E0929 18:44:16.752991 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.824772 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.824832 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.824843 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.824867 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.824880 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.928340 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.928432 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.928451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.928481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:16 crc kubenswrapper[4780]: I0929 18:44:16.928503 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:16Z","lastTransitionTime":"2025-09-29T18:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.031838 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.031915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.031935 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.031962 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.031982 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.135894 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.135969 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.135988 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.136017 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.136035 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.240376 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.240468 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.240491 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.240535 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.240558 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.343927 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.343978 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.343987 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.344006 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.344016 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.447734 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.447793 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.447805 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.447824 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.447839 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.556602 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.556669 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.556688 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.556715 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.556734 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.659671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.659913 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.660042 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.660223 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.660342 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.695811 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.695876 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.695893 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.695913 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.695929 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.717366 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:17Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.723368 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.723462 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.723486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.723520 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.723540 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.745914 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:17Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.751231 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.751306 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.751323 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.751352 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.751372 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.752154 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.752327 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.771764 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:17Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.777568 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.777613 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.777623 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.777646 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.777657 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.801006 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:17Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.807297 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.807371 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.807398 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.807443 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.807463 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.827267 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:17Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:17 crc kubenswrapper[4780]: E0929 18:44:17.827552 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.830547 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.830621 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.830643 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.830679 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.830702 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.934192 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.934273 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.934294 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.934323 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:17 crc kubenswrapper[4780]: I0929 18:44:17.934341 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:17Z","lastTransitionTime":"2025-09-29T18:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.037368 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.037418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.037427 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.037444 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.037454 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.140397 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.140476 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.140505 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.140531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.140554 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.245382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.245678 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.245710 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.245745 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.245769 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.349695 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.349764 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.349792 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.349827 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.349851 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.453863 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.453926 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.453944 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.453972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.453991 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.556787 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.556840 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.556852 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.556875 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.556890 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.660443 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.660492 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.660502 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.660519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.660530 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.752843 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.752888 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.752932 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:18 crc kubenswrapper[4780]: E0929 18:44:18.753032 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:18 crc kubenswrapper[4780]: E0929 18:44:18.753284 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:18 crc kubenswrapper[4780]: E0929 18:44:18.753446 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.763637 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.763710 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.763725 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.763748 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.763764 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.866979 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.867072 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.867088 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.867112 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.867128 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.971068 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.971148 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.971172 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.971206 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:18 crc kubenswrapper[4780]: I0929 18:44:18.971231 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:18Z","lastTransitionTime":"2025-09-29T18:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.074938 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.074997 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.075010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.075032 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.075114 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.178475 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.178536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.178548 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.178569 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.178583 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.281926 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.282010 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.282038 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.282115 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.282143 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.385370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.385444 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.385458 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.385480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.385495 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.489341 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.489421 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.489446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.489479 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.489498 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.593018 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.593141 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.593160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.593183 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.593199 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.697325 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.697375 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.697386 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.697415 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.697428 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.752737 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:19 crc kubenswrapper[4780]: E0929 18:44:19.752969 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.801256 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.801331 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.801357 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.801389 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.801413 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.905219 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.905345 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.905363 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.905396 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:19 crc kubenswrapper[4780]: I0929 18:44:19.905422 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:19Z","lastTransitionTime":"2025-09-29T18:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.009144 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.009224 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.009244 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.009274 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.009295 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.112847 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.112915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.112933 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.113014 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.113084 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.216151 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.216238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.216260 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.216294 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.216323 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.319891 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.319944 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.319960 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.319985 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.320004 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.423293 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.423408 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.423428 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.423455 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.423476 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.527844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.527937 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.527972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.528004 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.528026 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.631220 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.631326 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.631346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.631372 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.631393 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.734489 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.734550 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.734567 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.734592 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.734611 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.752474 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.752474 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:20 crc kubenswrapper[4780]: E0929 18:44:20.752854 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.753006 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:20 crc kubenswrapper[4780]: E0929 18:44:20.753135 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:20 crc kubenswrapper[4780]: E0929 18:44:20.753280 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.784009 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.809170 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.826197 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.838259 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.838322 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.838341 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.838367 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.838386 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.848484 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.869476 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.892361 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.905389 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.924678 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\
\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.941341 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.942291 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.942346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.942361 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.942382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.942397 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:20Z","lastTransitionTime":"2025-09-29T18:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.954547 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.969864 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:20 crc kubenswrapper[4780]: I0929 18:44:20.985564 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:20Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.010490 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.029484 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046206 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046349 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046712 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046723 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046739 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.046754 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.064889 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.080951 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.097371 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:21Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.150403 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.150492 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc 
kubenswrapper[4780]: I0929 18:44:21.150519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.150555 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.150580 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.253448 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.253515 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.253527 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.253545 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.253576 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.356486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.356543 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.356556 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.356575 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.356590 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.459804 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.459852 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.459861 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.459879 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.460115 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.567922 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.567968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.567977 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.567991 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.568004 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.670927 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.671272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.671391 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.671488 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.671588 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:21Z","lastTransitionTime":"2025-09-29T18:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:21 crc kubenswrapper[4780]: I0929 18:44:21.752428 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:21 crc kubenswrapper[4780]: E0929 18:44:21.753101 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block (as above) repeats at 18:44:21.775, .879, .983 and 18:44:22.085, .189, .294, .397, .537, .640, .745]
Sep 29 18:44:22 crc kubenswrapper[4780]: I0929 18:44:22.752008 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:22 crc kubenswrapper[4780]: I0929 18:44:22.752217 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:22 crc kubenswrapper[4780]: E0929 18:44:22.752446 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:22 crc kubenswrapper[4780]: I0929 18:44:22.752503 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:22 crc kubenswrapper[4780]: E0929 18:44:22.752686 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:22 crc kubenswrapper[4780]: E0929 18:44:22.752775 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Sep 29 18:44:22 crc kubenswrapper[4780]: I0929 18:44:22.756645 4780 scope.go:117] "RemoveContainer" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606"
Sep 29 18:44:22 crc kubenswrapper[4780]: E0929 18:44:22.757676 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa"
[node-status block (as above) repeats at 18:44:22.848]
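The CrashLoopBackOff entry above is the likely root cause of everything else in this window: on OVN-Kubernetes clusters the CNI config file is written by the ovnkube node pod, so while ovnkube-controller keeps restarting, /etc/kubernetes/cni/net.d/ stays empty. A plausible way to inspect the failing container, assuming oc access and taking the pod name from the log:

    $ oc -n openshift-ovn-kubernetes get pods -o wide
    $ oc -n openshift-ovn-kubernetes logs ovnkube-node-p7vtr -c ovnkube-controller --previous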
[node-status block (as above) repeats at 18:44:22.952 and 18:44:23.055, .158, .263, .366, .470, .574, .677]
Sep 29 18:44:23 crc kubenswrapper[4780]: I0929 18:44:23.752480 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:23 crc kubenswrapper[4780]: E0929 18:44:23.752758 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block (as above) repeats at 18:44:23.780, .887, .991 and 18:44:24.096, .199, .303, .407, .511, .616, .721]
Sep 29 18:44:24 crc kubenswrapper[4780]: I0929 18:44:24.752443 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:24 crc kubenswrapper[4780]: I0929 18:44:24.752546 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:24 crc kubenswrapper[4780]: I0929 18:44:24.752535 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:24 crc kubenswrapper[4780]: E0929 18:44:24.752668 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:24 crc kubenswrapper[4780]: E0929 18:44:24.753135 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:24 crc kubenswrapper[4780]: E0929 18:44:24.753230 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
[node-status block (as above) repeats at 18:44:24.824 and .927]
[node-status block (as above) repeats at 18:44:25.030, .134, .237, .340, .443, .547, .650]
Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.752145 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:25 crc kubenswrapper[4780]: E0929 18:44:25.752303 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[node-status block (as above) repeats at 18:44:25.754]
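The same pods fail on every retry because the kubelet requeues them on a backoff while the node condition stays Ready=False. To watch the condition flip once the network plugin comes up, a hedged example from any host with oc configured against the cluster:

    $ oc get node crc -w
    $ oc describe node crc | grep -A6 'Conditions:'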
Has your network provider started?"} Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.857090 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.857151 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.857169 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.857191 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:25 crc kubenswrapper[4780]: I0929 18:44:25.857211 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:25Z","lastTransitionTime":"2025-09-29T18:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-message status block (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats roughly every 100 ms at 18:44:25.959, 18:44:26.062, .165, .267, .370, .473, .578 and .681; duplicates elided ...]
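[editor's note: every "Node became not ready" condition above carries one concrete root cause: the kubelet's network-readiness check finds no CNI configuration file in /etc/kubernetes/cni/net.d/. Below is a minimal Python sketch of that check, assuming readiness just means at least one .conf/.conflist/.json file in the confDir named by the log message; it is an illustration, not kubelet code.

from pathlib import Path

# Path taken verbatim from the journal message above.
CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

def network_ready(conf_dir: Path = CNI_CONF_DIR) -> bool:
    # CRI-O and the kubelet accept .conf, .conflist and .json CNI files;
    # globbing a missing or empty directory simply yields nothing.
    patterns = ("*.conf", "*.conflist", "*.json")
    found = [p for pat in patterns for p in conf_dir.glob(pat)]
    return bool(found)

if __name__ == "__main__":
    if network_ready():
        print("NetworkReady=true")
    else:
        print(f"NetworkReady=false: no CNI configuration file in {CNI_CONF_DIR}/. "
              "Has your network provider started?")

Until the network provider writes a config into that directory, the sketch prints the same complaint the journal repeats above.]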
Sep 29 18:44:26 crc kubenswrapper[4780]: I0929 18:44:26.753001 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:26 crc kubenswrapper[4780]: I0929 18:44:26.753134 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:26 crc kubenswrapper[4780]: I0929 18:44:26.753253 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:26 crc kubenswrapper[4780]: E0929 18:44:26.753485 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:26 crc kubenswrapper[4780]: E0929 18:44:26.753652 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:26 crc kubenswrapper[4780]: E0929 18:44:26.753841 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
[... status block repeats at 18:44:26.784, .887, .991 and 18:44:27.093, .197, .301, .404, .508, .611 and .714; duplicates elided ...]
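[editor's note: the three pods above (a fourth, network-check-source-55646444c4-trplf, follows at 18:44:27.752) are all blocked the same way: no sandbox can be started while the network is unready, so pod_workers logs "Error syncing pod, skipping" and requeues them. A hypothetical helper for pulling the affected pods and UIDs out of a saved excerpt of this journal; the regex is fitted to this klog formatting, not to any stable interface.

import re

# Captures the namespaced pod name and UID from the pod_workers lines above.
SYNC_ERR = re.compile(
    r'"Error syncing pod, skipping".*?'
    r'pod="(?P<pod>[^"]+)" podUID="(?P<uid>[^"]+)"'
)

def blocked_pods(journal_text: str) -> dict[str, str]:
    """Return {namespace/pod: podUID} for pods failing to sync."""
    return {m["pod"]: m["uid"] for m in SYNC_ERR.finditer(journal_text)}

Fed this excerpt, it returns network-check-target-xd92c, networking-console-plugin-85b44fc459-gdk6g and network-metrics-daemon-j6vxr with the podUIDs shown above.]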
Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.752370 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:27 crc kubenswrapper[4780]: E0929 18:44:27.752618 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... status block repeats at 18:44:27.817, .920 and .950; duplicates elided ...]
Sep 29 18:44:27 crc kubenswrapper[4780]: E0929 18:44:27.967451 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:27Z is after 
2025-08-24T17:21:41Z"
[... the kubelet immediately retried and logged the identical "Error updating node status, will retry" patch payload at 18:44:27.986640, 18:44:28.010516 and 18:44:28.030793, each attempt failing with the same expired-certificate error from the node.network-node-identity.openshift.io webhook, interleaved with the same five-message status block at 18:44:27.972, .992 and 18:44:28.015; duplicates elided ...]
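[editor's note: the "Error updating node status" failures are a second, independent fault: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, so every status PATCH dies in the TLS handshake more than a month later. A sketch reproducing the verification the kubelet's client performs, assuming it runs on the node itself and that cafile points at whichever CA bundle actually signs this webhook's certificate; without that CA, OpenSSL may report an unknown-issuer error before it ever reaches the expiry check.

import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log line

def check_webhook_cert(host: str = HOST, port: int = PORT,
                       cafile: str | None = None) -> None:
    # A verifying handshake checks the chain and the validity window,
    # which is exactly where the kubelet's HTTPS client failed.
    ctx = ssl.create_default_context(cafile=cafile)
    ctx.check_hostname = False  # the kubelet dials a bare IP, not a DNS name
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock):
                print("handshake OK: certificate chain currently valid")
    except ssl.SSLCertVerificationError as err:
        # For this node it should print "certificate has expired",
        # matching the x509 error in the journal.
        print(f"verification failed: {err.verify_message}")

if __name__ == "__main__":
    check_webhook_cert()

Rotating the expired certificate (or the bundle that signs it) is presumably what unblocks the status patches; fixing the CNI configuration alone would not clear this error.]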
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:27Z is after 
2025-08-24T17:21:41Z" Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.992374 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.992427 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.992436 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.992470 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:27 crc kubenswrapper[4780]: I0929 18:44:27.992482 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:27Z","lastTransitionTime":"2025-09-29T18:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.010516 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:28Z is after 
2025-08-24T17:21:41Z" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.015702 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.015791 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.015809 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.015845 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.015861 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.030793 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:28Z is after 
2025-08-24T17:21:41Z" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.036919 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.036963 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.036975 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.036996 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.037008 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.055585 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:28Z is after 
2025-08-24T17:21:41Z" Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.055748 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.057457 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.057518 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.057536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.057567 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.057588 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.161009 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.161128 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.161150 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.161180 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.161255 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.264651 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.264697 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.264709 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.264728 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.264740 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.368136 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.368196 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.368210 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.368232 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.368247 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.471537 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.471577 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.471600 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.471615 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.471627 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
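The five-entry block above (four recorded events plus the Ready=False condition) now repeats roughly every 100 ms. Since setters.go logs the condition as literal JSON, the journal alone is enough to measure how long the node has been flapping. A sketch, assuming journalctl access on the node and that the kubelet runs as the systemd unit named kubelet:

```python
import json
import re
import subprocess

# Dump the kubelet journal as bare message lines.
out = subprocess.run(
    ["journalctl", "-u", "kubelet", "-o", "cat", "--no-pager"],
    capture_output=True, text=True, check=True,
).stdout

# The condition printed after "Node became not ready" is flat JSON (no nesting),
# so a non-greedy brace match is safe here.
pat = re.compile(r'"Node became not ready" node="crc" condition=(\{[^}]*\})')
stamps = [json.loads(m.group(1))["lastHeartbeatTime"] for m in pat.finditer(out)]
if stamps:
    print(f"{len(stamps)} NotReady conditions, first at {stamps[0]}, last at {stamps[-1]}")
```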
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.574340 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.574378 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.574387 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.574404 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.574416 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.677078 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.677122 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.677143 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.677181 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.677191 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.752172 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.752288 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.752389 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
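The sandbox failures repeat the root symptom from the Ready condition: nothing has written a CNI config into /etc/kubernetes/cni/net.d/ yet, which on OpenShift is multus's job once the cluster network operator feeds it. Whether the directory is really empty is cheap to verify; a stdlib-only sketch:

```python
import json
from pathlib import Path

# Directory named in the kubelet error.
cni_dir = Path("/etc/kubernetes/cni/net.d")

if not cni_dir.is_dir():
    print(f"{cni_dir} is missing entirely")
else:
    confs = sorted(p for p in cni_dir.iterdir() if p.suffix in {".conf", ".conflist"})
    if not confs:
        print(f"{cni_dir} exists but holds no CNI config; the node stays NotReady")
    for p in confs:
        cfg = json.loads(p.read_text())
        # A .conflist carries a "plugins" array; a bare .conf has a single "type".
        plugins = [pl.get("type") for pl in cfg.get("plugins", [])] or cfg.get("type")
        print(p.name, "->", cfg.get("name"), plugins)
```

An empty directory is the expected result here until the kube-multus container (seen exiting with code 1 further down) stays up long enough to drop its config, typically 00-multus.conf.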
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.752450 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.752502 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.752556 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.780747 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.780801 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.780813 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.780836 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.780850 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.869171 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.869428 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:44:28 crc kubenswrapper[4780]: E0929 18:44:28.869552 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:45:00.869526249 +0000 UTC m=+100.817824293 (durationBeforeRetry 32s). 
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.884072 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.884152 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.884173 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.884204 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.884222 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.986828 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.986886 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.986896 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.986914 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:28 crc kubenswrapper[4780]: I0929 18:44:28.986928 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:28Z","lastTransitionTime":"2025-09-29T18:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.090647 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.090692 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.090703 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.090726 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.090739 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.193109 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.193160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.193176 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.193196 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.193206 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.295162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.295224 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.295244 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.295268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.295285 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.397689 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.397733 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.397753 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.397774 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.397787 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.500575 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.500630 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.500644 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.500664 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.500675 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.602902 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.602940 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.602949 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.602963 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.602973 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.705988 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.706034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.706065 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.706086 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.706100 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.752678 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:29 crc kubenswrapper[4780]: E0929 18:44:29.752844 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.808984 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.809040 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.809085 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.809105 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.809118 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.912142 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.912192 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.912209 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.912236 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:29 crc kubenswrapper[4780]: I0929 18:44:29.912254 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:29Z","lastTransitionTime":"2025-09-29T18:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.014766 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.014821 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.014834 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.014854 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.014867 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.118005 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.118115 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.118135 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.118164 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.118183 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.221804 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.221862 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.221939 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.221977 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.221992 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.296565 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/0.log" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.296627 4780 generic.go:334] "Generic (PLEG): container finished" podID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" containerID="59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7" exitCode=1 Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.296673 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerDied","Data":"59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.297226 4780 scope.go:117] "RemoveContainer" containerID="59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.321277 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.327457 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.327503 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.327516 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.327536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.327549 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.336509 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.349812 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.364561 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.376099 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.390504 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.403561 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.413347 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.429113 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.431398 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.431670 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.431700 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.432002 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.432021 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.442357 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.454675 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.466660 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.486489 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.500219 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.511511 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.520955 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.529947 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc 
kubenswrapper[4780]: I0929 18:44:30.534480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.534531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.534540 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.534559 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.534568 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.541422 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.637576 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.637625 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.637633 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.637649 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.637660 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.740162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.740219 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.740231 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.740256 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.740271 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.752915 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.752915 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:30 crc kubenswrapper[4780]: E0929 18:44:30.753167 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:30 crc kubenswrapper[4780]: E0929 18:44:30.753318 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.753977 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:30 crc kubenswrapper[4780]: E0929 18:44:30.754243 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.772122 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.788721 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.807760 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.823149 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.842137 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.847295 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.847353 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.847365 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.847385 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.847398 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.865806 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839
eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.882666 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.895517 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.917397 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.935145 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.951643 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.951893 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.952121 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.952141 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.952168 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.952188 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:30Z","lastTransitionTime":"2025-09-29T18:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.967410 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.983355 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:30 crc kubenswrapper[4780]: I0929 18:44:30.994334 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:30Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.007196 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.020687 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.036474 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.055014 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.055087 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.055100 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.055124 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.055139 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.060016 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.157609 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.158008 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.158136 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.158228 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.158303 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.261705 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.261756 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.261771 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.261792 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.261807 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.303245 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/0.log" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.303829 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerStarted","Data":"bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.337484 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839
eb018624f3f2adadc3076606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.353695 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.364370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.364437 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.364456 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.364478 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.364492 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.368962 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.388733 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.402719 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.416955 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132
fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.428984 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.448327 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.462977 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.467408 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.467465 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.467486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.467513 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.467534 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.480509 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.493496 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.510624 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.534870 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.548078 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.560797 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.570336 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.570373 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.570385 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.570405 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.570418 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.580562 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.594176 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.604074 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:31Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.674621 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.674695 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.674719 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.674752 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.674774 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.752809 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:31 crc kubenswrapper[4780]: E0929 18:44:31.753014 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.778851 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.778919 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.779138 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.779165 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.779184 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.882092 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.882132 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.882147 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.882173 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.882189 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.985627 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.985708 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.985726 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.985752 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:31 crc kubenswrapper[4780]: I0929 18:44:31.985769 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:31Z","lastTransitionTime":"2025-09-29T18:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.088821 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.088872 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.088883 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.088903 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.088921 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.191435 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.191480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.191517 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.191534 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.191548 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.295431 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.295486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.295495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.295515 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.295528 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.398653 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.398697 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.398708 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.398725 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.398736 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.501994 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.502029 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.502037 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.502071 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.502081 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.605660 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.605711 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.605723 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.605744 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.605759 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.708347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.708418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.708434 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.708460 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.708478 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.753244 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.753274 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.753420 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:32 crc kubenswrapper[4780]: E0929 18:44:32.753617 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:32 crc kubenswrapper[4780]: E0929 18:44:32.753991 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:32 crc kubenswrapper[4780]: E0929 18:44:32.754124 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.770893 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.812107 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.812145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.812157 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.812176 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.812190 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.914276 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.914317 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.914329 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.914347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:32 crc kubenswrapper[4780]: I0929 18:44:32.914360 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:32Z","lastTransitionTime":"2025-09-29T18:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.016953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.016990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.017001 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.017018 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.017029 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.119346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.119398 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.119411 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.119431 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.119444 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.222602 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.222688 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.222702 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.222747 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.222764 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.325554 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.325602 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.325613 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.325631 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.325645 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.428201 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.428255 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.428267 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.428288 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.428307 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.530893 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.531274 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.531365 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.531449 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.531556 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.634536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.634983 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.635219 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.635382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.635576 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.739301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.739355 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.739366 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.739386 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.739399 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.752637 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:33 crc kubenswrapper[4780]: E0929 18:44:33.753129 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.843303 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.843388 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.843414 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.843450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.843472 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.946727 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.947089 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.947218 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.947313 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:33 crc kubenswrapper[4780]: I0929 18:44:33.947394 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:33Z","lastTransitionTime":"2025-09-29T18:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.049912 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.050256 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.050469 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.050703 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.050908 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.153665 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.153717 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.153733 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.153755 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.153771 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.255716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.255979 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.256062 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.256145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.256218 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.359465 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.359525 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.359548 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.359578 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.359598 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.462650 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.462692 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.462704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.462723 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.462734 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.565845 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.565897 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.565908 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.565928 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.565941 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.673376 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.673430 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.673446 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.673471 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.673488 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.752510 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:34 crc kubenswrapper[4780]: E0929 18:44:34.752742 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.753200 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:34 crc kubenswrapper[4780]: E0929 18:44:34.753363 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.753574 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:34 crc kubenswrapper[4780]: E0929 18:44:34.753883 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.776890 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.776941 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.776952 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.776975 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.776991 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.880624 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.880707 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.880731 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.880758 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.880777 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.983611 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.983694 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.983731 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.983763 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:34 crc kubenswrapper[4780]: I0929 18:44:34.983787 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:34Z","lastTransitionTime":"2025-09-29T18:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.086738 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.086801 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.086814 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.086837 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.086853 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.189574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.189665 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.189711 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.189749 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.189773 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.292755 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.292817 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.292835 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.292857 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.292872 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.396879 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.397379 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.397486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.397587 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.397689 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.501494 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.501578 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.501605 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.501641 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.501667 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.604486 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.604536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.604551 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.604572 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.604584 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.713213 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.713272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.713290 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.713314 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.713332 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.752701 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:35 crc kubenswrapper[4780]: E0929 18:44:35.752933 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.816802 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.816869 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.816892 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.816916 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.816930 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.925687 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.926114 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.926478 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.926723 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:35 crc kubenswrapper[4780]: I0929 18:44:35.926948 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:35Z","lastTransitionTime":"2025-09-29T18:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.030573 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.030653 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.030675 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.030717 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.030748 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.135316 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.135399 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.135419 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.135448 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.135473 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.239016 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.239145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.239175 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.239295 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.239336 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.343940 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.344008 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.344028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.344079 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.344101 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.448912 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.448970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.448988 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.449017 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.449037 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.553297 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.553388 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.553416 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.553456 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.553480 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.656915 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.656997 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.657019 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.657084 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.657104 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.752753 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.752851 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:36 crc kubenswrapper[4780]: E0929 18:44:36.752951 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.752989 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:36 crc kubenswrapper[4780]: E0929 18:44:36.753207 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:36 crc kubenswrapper[4780]: E0929 18:44:36.753347 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
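
Each "Node became not ready" entry from setters.go serializes a single Ready condition object. Decoding one of the payloads quoted verbatim above makes the fields explicit; the struct below is a local illustration matching the field names visible in the log, not the imported k8s.io/api NodeCondition type:

```go
// nodecond.go: decodes the condition JSON that setters.go logs with
// "Node became not ready". The struct is a sketch of the fields visible
// in the log entries above.
package main

import (
	"encoding/json"
	"fmt"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied from one of the log entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("node Ready=%s because %s: %s\n", c.Status, c.Reason, c.Message)
}
```

The stream continues with the pod attributes of the preceding sync error:
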
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.759970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.760028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.760039 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.760070 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.760083 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.862374 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.862457 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.862481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.862512 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.862533 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.965811 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.965861 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.965874 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.965892 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:36 crc kubenswrapper[4780]: I0929 18:44:36.965906 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:36Z","lastTransitionTime":"2025-09-29T18:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.069844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.069942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.069964 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.069990 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.070008 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.172966 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.173029 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.173100 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.173125 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.173145 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.275891 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.275953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.275970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.275993 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.276010 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.379104 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.379171 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.379189 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.379217 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.379236 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.482626 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.483019 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.483351 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.483599 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.483812 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.587838 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.588884 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.589087 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.589257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.589393 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.693187 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.693622 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.693693 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.693759 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.693822 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.752268 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:37 crc kubenswrapper[4780]: E0929 18:44:37.752489 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.753664 4780 scope.go:117] "RemoveContainer" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.797616 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.797683 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.797707 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.797739 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.797762 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.901356 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.901417 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.901437 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.901466 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:37 crc kubenswrapper[4780]: I0929 18:44:37.901487 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:37Z","lastTransitionTime":"2025-09-29T18:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.005494 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.005555 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.005569 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.005595 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.005618 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.108435 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.108484 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.108500 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.108522 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.108536 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.211308 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.211360 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.211370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.211390 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.211402 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.322712 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.322772 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.322780 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.322797 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.322807 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.336504 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/2.log" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.337616 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.337667 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.337704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.337724 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.337735 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.340757 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.341681 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.353835 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.358626 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.358677 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.358690 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.358715 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.358732 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.362395 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.385247 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.373201 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.392906 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.392948 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.392957 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.392974 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.392985 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.399867 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.404710 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.409333 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.409526 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.409662 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.409780 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.409890 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.415712 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.428672 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.429876 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.433206 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.433480 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.433674 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.433844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.433983 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.447115 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.451539 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.451691 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.453905 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.453941 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.453950 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.453967 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.453977 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.461703 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.484585 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.498686 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.517936 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.538397 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.556734 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.557000 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.557019 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.557028 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.557059 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.557069 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.572196 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05e0976-033e-42c0-83fc-ed128d801e8d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.595542 4780 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:
43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.607641 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.623965 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.638503 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.655805 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.660329 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.660386 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.660404 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.660431 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.660454 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.675886 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.752143 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.752192 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.752192 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.752402 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.752526 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:38 crc kubenswrapper[4780]: E0929 18:44:38.752627 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.763751 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.764012 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.764110 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.764199 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.764361 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.867338 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.867396 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.867407 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.867426 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.867439 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.971316 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.971390 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.971418 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.971450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:38 crc kubenswrapper[4780]: I0929 18:44:38.971472 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:38Z","lastTransitionTime":"2025-09-29T18:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.074401 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.074499 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.074531 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.074569 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.074594 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.178121 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.178164 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.178200 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.178223 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.178234 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.280922 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.280970 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.280979 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.280993 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.281003 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.348484 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/3.log" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.349522 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/2.log" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.354138 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" exitCode=1 Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.354212 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.354283 4780 scope.go:117] "RemoveContainer" containerID="445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.361684 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:44:39 crc kubenswrapper[4780]: E0929 18:44:39.364414 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.375441 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.383660 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.383704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.383716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.383737 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.383754 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.398474 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.412494 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.428517 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.487385 4780 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.487422 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.487430 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.487447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.487457 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.488900 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c
099986015be6a7208a3f8008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://445caebafdbd12e29d34b32c9c4891d72c3c1839eb018624f3f2adadc3076606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:08Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI0929 18:44:08.668093 6415 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0929 18:44:08.668089 6415 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668105 6415 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz in node crc\\\\nI0929 18:44:08.668117 6415 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0929 18:44:08.668113 6415 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-j6vxr before timer (time: 2025-09-29 18:44:09.730474686 +0000 UTC m=+1.673013061): skip\\\\nI0929 18:44:08.668131 6415 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz after 0 failed attempt(s)\\\\nI0929 18:44:08.668133 6415 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0929 18:44:08.668145 6415 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz\\\\nF0929 18:44:08.668166 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"cp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 18:44:38.816614 6791 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.696067ms\\\\nF0929 18:44:38.817002 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z]\\\\nI0929 18:44:38.817017 6791 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"ho
stIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.510554 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.529921 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.545366 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.556513 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.567551 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.578695 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589451 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589487 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589500 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589520 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589534 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.589954 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.601709 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.617013 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.629392 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.641788 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.652260 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05e0976-033e-42c0-83fc-ed128d801e8d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.670356 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.685285 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:39Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.692395 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.692433 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.692447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 
18:44:39.692463 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.692475 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.752452 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:39 crc kubenswrapper[4780]: E0929 18:44:39.752675 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.795450 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.795494 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.795512 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.795535 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.795553 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.899097 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.899152 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.899170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.899197 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:39 crc kubenswrapper[4780]: I0929 18:44:39.899214 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:39Z","lastTransitionTime":"2025-09-29T18:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.002805 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.002859 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.002871 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.002895 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.002908 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.106018 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.106101 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.106121 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.106143 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.106158 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.210424 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.210491 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.210508 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.210536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.210559 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.314314 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.314354 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.314364 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.314382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.314392 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.360813 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/3.log" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.366767 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:44:40 crc kubenswrapper[4780]: E0929 18:44:40.367100 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.387267 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.411559 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.417236 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.417298 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.417312 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.417335 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.417350 4780 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.434235 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.459742 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.479728 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.516371 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"cp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 18:44:38.816614 6791 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.696067ms\\\\nF0929 18:44:38.817002 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z]\\\\nI0929 18:44:38.817017 6791 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.523902 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.523952 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.523962 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.523980 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.523991 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.538414 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.555851 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.572394 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.587569 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.603856 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.617105 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.626773 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.626822 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.626837 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.626859 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.626874 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.635566 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.658827 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.679602 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.697463 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.719370 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.730591 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.730660 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.730679 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.730706 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.730728 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.737665 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05e0976-033e-42c0-83fc-ed128d801e8d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.752190 4780 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.752190 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:40 crc kubenswrapper[4780]: E0929 18:44:40.752343 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.752384 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:40 crc kubenswrapper[4780]: E0929 18:44:40.752467 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:40 crc kubenswrapper[4780]: E0929 18:44:40.752549 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.766713 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb
42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.786109 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.801952 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.817331 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.833770 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.834909 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.834979 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.834996 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.835025 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.835050 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.859326 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.879593 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05e0976-033e-42c0-83fc-ed128d801e8d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.914504 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.930617 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.944812 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.945075 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.945154 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 
18:44:40.945247 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.945312 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:40Z","lastTransitionTime":"2025-09-29T18:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.950858 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.970818 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:40 crc kubenswrapper[4780]: I0929 18:44:40.989771 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:40Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.010276 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.028217 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.045658 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.047868 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.047946 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.047972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.048002 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.048021 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.062106 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.078302 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.096387 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.108783 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.138414 4780 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"cp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 18:44:38.816614 6791 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.696067ms\\\\nF0929 18:44:38.817002 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z]\\\\nI0929 18:44:38.817017 6791 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:41Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.151731 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.151775 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.151786 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.151810 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.151822 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.254433 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.254491 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.254700 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.254724 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.254737 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.357170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.357248 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.357268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.357301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.357323 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.460109 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.460671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.460763 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.460849 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.461023 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.564921 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.565272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.565369 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.565467 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.565549 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.669500 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.669584 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.669604 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.669637 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.669663 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.753084 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:41 crc kubenswrapper[4780]: E0929 18:44:41.753319 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.772748 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.772806 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.772825 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.772849 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.772866 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.876794 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.876890 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.876904 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.876953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.876971 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.980749 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.980819 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.980844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.980878 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:41 crc kubenswrapper[4780]: I0929 18:44:41.980903 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:41Z","lastTransitionTime":"2025-09-29T18:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.084182 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.084261 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.084284 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.084317 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.084339 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.188156 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.188244 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.188262 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.188290 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.188305 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.291448 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.291813 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.291896 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.292240 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.292326 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.394574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.394619 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.394631 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.394649 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.394662 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.498185 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.498231 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.498243 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.498321 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.498340 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.600862 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.600913 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.600930 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.600948 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.600995 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.703889 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.704001 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.704015 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.704034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.704066 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.752146 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.752186 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.752167 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:42 crc kubenswrapper[4780]: E0929 18:44:42.752302 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:42 crc kubenswrapper[4780]: E0929 18:44:42.752359 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:42 crc kubenswrapper[4780]: E0929 18:44:42.752417 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.806899 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.806971 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.806989 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.807014 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.807034 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.909413 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.909729 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.909807 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.909895 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:42 crc kubenswrapper[4780]: I0929 18:44:42.909959 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:42Z","lastTransitionTime":"2025-09-29T18:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.014660 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.014759 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.014788 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.014825 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.014861 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.118267 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.118349 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.118373 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.118406 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.118432 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.221591 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.221665 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.221690 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.221725 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.221747 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.325360 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.325432 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.325449 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.325476 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.325496 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.429356 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.429440 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.429457 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.429483 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.429502 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.532667 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.532771 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.532792 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.532824 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.532844 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.635172 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.635214 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.635223 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.635238 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.635247 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.738040 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.738169 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.738191 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.738217 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.738236 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.752813 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:43 crc kubenswrapper[4780]: E0929 18:44:43.752985 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.841919 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.841993 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.842006 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.842026 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.842038 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.945269 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.945327 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.945339 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.945362 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:43 crc kubenswrapper[4780]: I0929 18:44:43.945380 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:43Z","lastTransitionTime":"2025-09-29T18:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.048363 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.048412 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.048425 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.048445 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.048457 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.152246 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.152326 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.152346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.152377 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.152399 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.255595 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.255656 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.255677 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.255704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.255723 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.359382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.359456 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.359479 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.359506 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.359524 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.462655 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.462716 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.462732 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.462757 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.462775 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.546693 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.546818 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.546867 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.546910 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.546879397 +0000 UTC m=+148.495177471 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.546986 4780 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547097 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.547033971 +0000 UTC m=+148.495332055 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.546974 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547202 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547230 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547288 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547302 4780 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.547247 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547248 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547375 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.54735188 +0000 UTC m=+148.495649924 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547382 4780 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547399 4780 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547477 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.547452253 +0000 UTC m=+148.495750477 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.547521 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.547503554 +0000 UTC m=+148.495801838 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.566039 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.566113 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.566136 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.566168 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.566190 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.669404 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.669483 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.669506 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.669532 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.669550 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.752873 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.752876 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.753133 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.753386 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.753247 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:44 crc kubenswrapper[4780]: E0929 18:44:44.753506 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.772877 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.772939 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.772952 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.772972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.772989 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.876860 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.876914 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.876925 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.876941 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.876954 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.980854 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.980903 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.980912 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.980928 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:44 crc kubenswrapper[4780]: I0929 18:44:44.980939 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:44Z","lastTransitionTime":"2025-09-29T18:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.084315 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.084390 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.084408 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.084435 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.084453 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.188096 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.188177 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.188207 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.188240 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.188263 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.291608 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.291708 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.291737 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.291773 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.291794 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.394253 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.394305 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.394321 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.394347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.394363 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.497100 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.497146 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.497173 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.497191 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.497200 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.600516 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.600568 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.600584 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.600609 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.600627 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.704517 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.704603 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.704624 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.704652 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.704672 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.752682 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:45 crc kubenswrapper[4780]: E0929 18:44:45.752909 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.808948 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.809025 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.809042 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.809150 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.809171 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.914085 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.914134 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.914145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.914162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:45 crc kubenswrapper[4780]: I0929 18:44:45.914173 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:45Z","lastTransitionTime":"2025-09-29T18:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.017249 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.017288 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.017298 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.017313 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.017322 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.120024 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.120147 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.120162 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.120180 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.120195 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.223593 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.223662 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.223678 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.223704 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.223725 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.328079 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.328145 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.328159 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.328178 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.328195 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.432876 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.432956 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.432972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.432994 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.433010 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.535672 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.535714 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.535725 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.535740 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.535752 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.638705 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.638788 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.638808 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.638837 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.638856 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.742467 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.742537 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.742557 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.742584 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.742605 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.752904 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.753010 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:46 crc kubenswrapper[4780]: E0929 18:44:46.753154 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.753238 4780 util.go:30] "No sandbox for pod can be found. 
Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.753238 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:46 crc kubenswrapper[4780]: E0929 18:44:46.753528 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:46 crc kubenswrapper[4780]: E0929 18:44:46.753626 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.845826 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.845862 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.845872 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.845888 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.845898 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.949833 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.949906 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.949924 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.949953 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:46 crc kubenswrapper[4780]: I0929 18:44:46.949975 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:46Z","lastTransitionTime":"2025-09-29T18:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.052087 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.052122 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.052129 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.052144 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.052154 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.155300 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.155377 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.155400 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.155431 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.155455 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.258978 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.259085 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.259105 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.259138 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.259158 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.362410 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.362495 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.362514 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.362540 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.362559 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.466184 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.466246 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.466268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.466295 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.466314 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.569706 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.569789 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.569817 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.569851 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.569876 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.672780 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.672824 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.672836 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.672855 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.672870 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.752437 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:47 crc kubenswrapper[4780]: E0929 18:44:47.752663 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.775962 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.776005 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.776018 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.776035 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.776068 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.885111 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.885165 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.885179 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.885197 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.885214 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.988502 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.988559 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.988576 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.988598 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:47 crc kubenswrapper[4780]: I0929 18:44:47.988612 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:47Z","lastTransitionTime":"2025-09-29T18:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.092696 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.093123 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.093297 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.093473 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.093644 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.197654 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.198403 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.198815 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.199231 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.199478 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.303479 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.303533 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.303558 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.303586 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.303608 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.406268 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.406344 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.406370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.406409 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.406436 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.509574 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.509643 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.509661 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.509688 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.509711 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.613191 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.613485 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.613497 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.613519 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.613531 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.717189 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.717257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.717276 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.717304 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.717323 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.718894 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.718984 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.719013 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.719078 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.719106 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.743047 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:48Z is after 2025-08-24T17:21:41Z"
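Editor's note: this entry exposes the actual failure behind the status-patch retries: the "node.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 presents a TLS certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-09-29T18:44:48Z, so the API server rejects the kubelet's patch. A minimal sketch (not from the log; certcheck.go is a hypothetical name) that retrieves that endpoint's certificate and prints its validity window to confirm the x509 error:

```go
// certcheck.go: a hypothetical, minimal sketch that fetches the serving
// certificate of the webhook endpoint named in the error above
// (https://127.0.0.1:9743) and prints its validity window. Meant to be
// run on the node itself, where that loopback address is reachable.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// InsecureSkipVerify lets us retrieve the certificate even though
	// verification fails; we only inspect it, we do not trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		os.Exit(1)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		fmt.Println("no peer certificate presented")
		os.Exit(1)
	}
	cert := state.PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	if time.Now().UTC().After(cert.NotAfter) {
		// Matches the log: current time 2025-09-29T18:44:48Z is after
		// the certificate's notAfter of 2025-08-24T17:21:41Z.
		fmt.Println("certificate is EXPIRED; webhook calls will fail x509 verification")
	}
}
```

Because the rejection comes from the webhook rather than from the kubelet itself, the kubelet simply retries, which is why the same multi-kilobyte patch and the same x509 message recur in the entries that follow (18:44:48.763658 and 18:44:48.791961).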
event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.747902 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.747920 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.747932 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.752492 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.752691 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.752735 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.752929 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.752943 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.753229 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.763658 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.767764 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.767819 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.767838 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.767860 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.767879 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.791961 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.798209 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.798285 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
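The repeated patch failures above share one root cause: before the node-status PATCH is admitted, the API server calls the validating webhook node.network-node-identity.openshift.io at https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z, more than a month before the log's clock of 2025-09-29T18:44:48Z. Below is a minimal Go diagnostic sketch (a hypothetical helper, not part of OpenShift or kubelet) that fetches a TLS serving certificate and reproduces the NotAfter comparison behind Go's "x509: certificate has expired or is not yet valid" message:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// Webhook endpoint taken from the log line above; adjust for other clusters.
	addr := "127.0.0.1:9743"
	conn, err := tls.Dial("tcp", addr, &tls.Config{
		// Skip chain verification so the handshake succeeds even with an
		// expired certificate; we re-check the validity window ourselves.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, "dial:", err)
		os.Exit(1)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("subject:  ", leaf.Subject)
	fmt.Println("notBefore:", leaf.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", leaf.NotAfter.Format(time.RFC3339))
	if now.After(leaf.NotAfter) {
		// Same comparison that yields "x509: certificate has expired or is
		// not yet valid: current time ... is after ..." in the log above.
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			now.Format(time.RFC3339), leaf.NotAfter.Format(time.RFC3339))
	}
}
```

On CRC this condition typically clears once the cluster's rotated certificates are regenerated, for example by restarting the instance and letting certificate rotation complete.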
event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.798303 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.798331 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.798354 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.822185 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.827740 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.827968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.828112 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.828220 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.828319 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.851330 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bb5e3f8e-349c-4fe8-b6cc-8fe8c6b497f0\\\",\\\"systemUUID\\\":\\\"7e834951-590e-482e-8249-2efa8589f762\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:48Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:48 crc kubenswrapper[4780]: E0929 18:44:48.851462 4780 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.853998 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
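The final E-line marks kubelet giving up for this sync period: it retries the status update a fixed number of times per sync (the nodeStatusUpdateRetry constant, 5 in upstream kubelet) before logging "Unable to update node status" err="update node status exceeds retry count". A condensed paraphrase of that control flow (an illustrative sketch, not the verbatim kubelet source):

```go
package main

import (
	"errors"
	"fmt"
)

// Retry budget per sync; matches kubelet's nodeStatusUpdateRetry constant.
const nodeStatusUpdateRetry = 5

// tryUpdateNodeStatus stands in for the status PATCH that the admission
// webhook rejects above; here it always fails, as it does while the
// webhook's serving certificate is expired.
func tryUpdateNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			// Corresponds to "Error updating node status, will retry".
			fmt.Println("Error updating node status, will retry:", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		// Corresponds to the final "Unable to update node status" E-line.
		fmt.Println("Unable to update node status:", err)
	}
}
```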
event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.854035 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.854066 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.854084 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.854097 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.956161 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.956212 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.956223 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.956242 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:48 crc kubenswrapper[4780]: I0929 18:44:48.956255 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:48Z","lastTransitionTime":"2025-09-29T18:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.059153 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.059203 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.059218 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.059239 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.059254 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.161875 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.161914 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.161923 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.161937 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.161947 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.264968 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.265025 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.265034 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.265059 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.265070 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.368512 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.368561 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.368570 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.368585 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.368597 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.471425 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.471475 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.471485 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.471506 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.471519 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.574742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.574791 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.574801 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.574816 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.574828 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.677505 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.677550 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.677561 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.677578 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.677588 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.752040 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:49 crc kubenswrapper[4780]: E0929 18:44:49.752200 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.780368 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.780392 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.780401 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.780416 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.780427 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.882624 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.882713 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.882737 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.882766 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.882791 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.985571 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.985655 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.985675 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.985705 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:49 crc kubenswrapper[4780]: I0929 18:44:49.985726 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:49Z","lastTransitionTime":"2025-09-29T18:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.089454 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.089536 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.089556 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.089589 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.089610 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.193417 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.193481 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.193500 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.193529 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.193549 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.297850 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.297952 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.297972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.297998 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.298015 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.401173 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.401260 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.401286 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.401326 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.401353 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.505202 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.505315 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.505342 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.505378 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.505407 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.608882 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.608957 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.608976 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.609004 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.609023 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.712671 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.712730 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.712748 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.712776 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.712799 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.753304 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.753329 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.753515 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:50 crc kubenswrapper[4780]: E0929 18:44:50.753891 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:50 crc kubenswrapper[4780]: E0929 18:44:50.754533 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:50 crc kubenswrapper[4780]: E0929 18:44:50.754643 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.779728 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wc8rf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c2af9fc-5cef-48e3-8070-cf2767bc4a81\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:29Z\\\",\\\"message\\\":\\\"2025-09-29T18:43:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70\\\\n2025-09-29T18:43:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_031abd46-3751-40d8-bfc4-a6f9d649ac70 to /host/opt/cni/bin/\\\\n2025-09-29T18:43:44Z [verbose] multus-daemon started\\\\n2025-09-29T18:43:44Z [verbose] Readiness Indicator file check\\\\n2025-09-29T18:44:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzswm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wc8rf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.801886 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67a6d63c-6762-464e-9216-a234506b74db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0e06132f78171dd448be4b49d40bd06e886a2ef664acbbf435125e8f5447f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zf7sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jrs9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.816406 4780 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.816493 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.816518 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.816549 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.816571 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.835392 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43a328df-2763-44f9-9512-3abb64ef45aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c
099986015be6a7208a3f8008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-29T18:44:38Z\\\",\\\"message\\\":\\\"cp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.176:1936: 10.217.4.176:443: 10.217.4.176:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {595f6e90-7cd8-4871-85ab-9519d3c9c3e5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0929 18:44:38.816614 6791 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.696067ms\\\\nF0929 18:44:38.817002 6791 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:38Z is after 2025-08-24T17:21:41Z]\\\\nI0929 18:44:38.817017 6791 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:44:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5r2sf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p7vtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.857362 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.876480 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb112a65bf1393c9ce760245063e246a73bd81902915cf4651ce23cc86ad5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.894500 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbcc9dd3-6eaf-4833-92f1-d126a87bbd49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fa9c601961e9c923fb07465158e4628b335405c7a68013c4358481728b5b4e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5ed98b6c8a3fd05970b737c4b73020f6442dced2eb4db92f58505b732f12a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8rwr6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5smhz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 
18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.908656 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b75391-2034-4284-b779-eb7b1e9da774\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tcbf5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6vxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.919832 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.919878 4780 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.919890 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.919908 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.919919 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:50Z","lastTransitionTime":"2025-09-29T18:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.927351 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3150b9-1d09-4d54-bc00-d6416a108347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74fc4a264c20a12144cbc80077af98cb8f415b6adf3634a20319241a972d376b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2336023b3faade4ba6f8309b53f016f6aed59f78ed365366fc0cda2161eae20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20da266309733c011b32fb9680ef96c0f3833c6e5260928fede8c6c1a243ad6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d6d0e991d343b90e7ba8419132fa123b623779bc5edad128595af5cd89bce45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.944935 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58bd68b-97a1-4a2b-a772-c6f8a3ea2472\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c948580fbd315a49f9d0849466ec08b0cd70136cc6f0b5c8dfb8960b0f8ab981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2brb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.959916 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641fc2cd-3763-40a9-a61f-ab4570912da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f187f28e128d1ef4b50eef0b49334672297a442c2e8a7f07a506429e3930231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9902f7f2407070ef37cf5ba1d268a69d7be6bf6731181358080c0273719abe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bddcfeda3145320586cf5f37ac411a66ab581a3a15980991ea4cb84ecc59b348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918fcb75d9835667f8c0a3a03f946dfab732d9ba5fd01568a56063fee90e5451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.974964 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:50 crc kubenswrapper[4780]: I0929 18:44:50.992566 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7100181276c443202173f0ba67d8371e0c838f49543c78d70c439ef61bc89443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d65c374e73c481eb8d5595050669bd62b5fdf82b5a99b1e3cfa40cf4b342bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:50Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.006558 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.023787 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.023817 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.023826 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.023844 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.023855 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.029066 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"772477ed-f72b-4cae-9042-d9284309476c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abd330831abf21edb30381d16f5e390a2690e659ae2370163bfe18a3654e6530\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb957fb4fbbbbc837312cf52c76c34e83aa34206102f59233f3e64164d8c47ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c63447d3708a8739dac5ba35ea6b80ac559e1a4832a41b20f674f09d25c81de6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://762ccddb1b77babd37be814bcd2efa30a5cc2088274619177791e6331fca7067\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d2f931ccbf03a428be9e148f8a2254ca6509e6a8283eb5e5dc8bda9712b07b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad64058858f5c6caa9eedce0a1f57a437bd4ecfb00e136a13861d6de43bb3ea4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1e41eb5afdc78e60fd8fbc3532412efb5e1052c784b56b9b285e566ebf2e570\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xbfvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gk8l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.041120 4780 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05e0976-033e-42c0-83fc-ed128d801e8d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbdf88051c4c7ab9f12fffc7dabfbc1ee611f25683d84bd4969b8e2075ec9663\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7812c58b36185203b120da2aa42c6e24e2b928a87d464a33e8089d4461875c22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.064839 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9dcc0d8-d0f0-4a63-b708-f04a874dd7b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9dab5defe2eb48d5c1efc93efbe2f0959574258fca327147295a678dc38a3f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f49e628bf2689267766839a339d6458f12b444dca9071c45bb3d88437c87b3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdb8ef0c05c6579c2b5c9ca28b0afb4f89cc4b6493f535989598b1f938de7439\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62fa854989e220a075d99c1e2e20d70aa64eb25
a2247907645b3192189033a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b2b49f88b1a77062724f3763cd6b41b0bb236bd19b7d0f2fb5aaaa0cd010b2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4385d87374fde1f409844f9854f6762f0b4a0b788c54034e77d207f10fe2c445\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c6d1e7b10942eeb42fa3ed6469c290f6c05b6825c8c8fb8dee9733637b3e77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2c737eab0eb8f2ef61177d7304fce73df112482776e1ab14067e4db10d7b49d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.076115 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f8mfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e46edd0-3650-4fbc-8ad6-d29defbd30de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fd8b37cb3b91fe91d3913fcf12e28d38db59db307f9c3216d756b2fefeaff79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjjjf\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f8mfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.089540 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0749927-91f4-4c72-8b5e-465ff66d82b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c281bfb24e20c64743a0f2822404a418d22db9c9fb10c5fdba18c53b4c3eaa66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f3754c03066c0ae1d1ebeed0704d74d9ef104ddc53d260b047cc9255001147f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30b4edf24e6ff042f49589142e81dee2bfd566b44c38ade350ec7aa05f5e099\\\",\\\"image\\\":\\\"
quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5beb15b9017c9cdcfabba40300a4d6564619d9d53791b8492e260f92c2d8a224\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6812db5877ef804b815cd5aeef82562d7e934fbd5d3c3da5386a76ef0742ec7f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-29T18:43:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0929 18:43:34.428216 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0929 18:43:34.431887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2243814894/tls.crt::/tmp/serving-cert-2243814894/tls.key\\\\\\\"\\\\nI0929 18:43:40.534670 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0929 18:43:40.539194 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0929 18:43:40.539221 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0929 18:43:40.539262 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0929 18:43:40.539272 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0929 18:43:40.556405 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0929 18:43:40.558555 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558622 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0929 18:43:40.558635 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0929 18:43:40.558625 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0929 18:43:40.558656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0929 18:43:40.558670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0929 18:43:40.558681 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0929 18:43:40.560443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4417487dd2a12ef919ddec18baf74c4ce743ea408324ad91259670433faaee70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e5d5caf2ae02ecb7d2c1cbbd73232b5b33c2393124aa385da527a45ee3830e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-29T18:43:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-29T18:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T18:43:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.102803 4780 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-29T18:43:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c655edb79bb97937ed05e40395ec451971d4bae4f0e2417c9fa1b25b513e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-29T18:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-29T18:44:51Z is after 2025-08-24T17:21:41Z" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.126327 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.126360 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.126368 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.126382 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.126394 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.230158 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.230266 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.230280 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.230300 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.230312 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.332282 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.332349 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.332367 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.332390 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.332407 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.436165 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.436233 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.436254 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.436284 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.436301 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.539613 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.539673 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.539689 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.539711 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.539725 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.642528 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.642595 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.642620 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.642659 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.642685 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.746905 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.746972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.746993 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.747027 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.747082 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.752080 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:51 crc kubenswrapper[4780]: E0929 18:44:51.752333 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.850296 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.850369 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.850428 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.850465 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.850487 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.953917 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.953972 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.953989 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.954015 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:51 crc kubenswrapper[4780]: I0929 18:44:51.954034 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:51Z","lastTransitionTime":"2025-09-29T18:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.056956 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.057404 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.057490 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.057571 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.057648 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:52Z","lastTransitionTime":"2025-09-29T18:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle repeats at ~100 ms intervals through 18:44:52 ...]
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.752507 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.752626 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:52 crc kubenswrapper[4780]: E0929 18:44:52.752729 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:52 crc kubenswrapper[4780]: E0929 18:44:52.752851 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.752882 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:52 crc kubenswrapper[4780]: E0929 18:44:52.753836 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:52 crc kubenswrapper[4780]: I0929 18:44:52.754374 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"
Sep 29 18:44:52 crc kubenswrapper[4780]: E0929 18:44:52.754669 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa"
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.103254 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.103301 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.103313 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.103332 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.103344 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:53Z","lastTransitionTime":"2025-09-29T18:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle repeats at ~100 ms intervals through 18:44:53 ...]
Sep 29 18:44:53 crc kubenswrapper[4780]: I0929 18:44:53.752714 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:53 crc kubenswrapper[4780]: E0929 18:44:53.752928 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.035921 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.035976 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.035987 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.036005 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.036020 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:54Z","lastTransitionTime":"2025-09-29T18:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle repeats at ~100 ms intervals through 18:44:54 ...]
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.753383 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.753423 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:54 crc kubenswrapper[4780]: I0929 18:44:54.753524 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:54 crc kubenswrapper[4780]: E0929 18:44:54.753696 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:54 crc kubenswrapper[4780]: E0929 18:44:54.753946 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:54 crc kubenswrapper[4780]: E0929 18:44:54.754163 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.075006 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.075089 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.075108 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.075135 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.075152 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:55Z","lastTransitionTime":"2025-09-29T18:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle repeats at ~100 ms intervals through 18:44:55 ...]
Sep 29 18:44:55 crc kubenswrapper[4780]: I0929 18:44:55.752830 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:44:55 crc kubenswrapper[4780]: E0929 18:44:55.753101 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.006655 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.006710 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.006727 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.006749 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.006766 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:56Z","lastTransitionTime":"2025-09-29T18:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the same five-entry node-status cycle repeats at ~100 ms intervals through 18:44:56 ...]
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.755303 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:44:56 crc kubenswrapper[4780]: E0929 18:44:56.755554 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.756096 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:44:56 crc kubenswrapper[4780]: E0929 18:44:56.756247 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.756428 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr"
Sep 29 18:44:56 crc kubenswrapper[4780]: E0929 18:44:56.756622 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774"
Has your network provider started?"} Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.946488 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.946563 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.946589 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.946625 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:56 crc kubenswrapper[4780]: I0929 18:44:56.946652 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:56Z","lastTransitionTime":"2025-09-29T18:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.049845 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.049924 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.049944 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.049971 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.049994 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.153700 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.153754 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.153769 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.153791 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.153808 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.257454 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.257512 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.257522 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.257547 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.257561 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.360813 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.360896 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.360929 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.360959 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.360980 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.464225 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.464295 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.464318 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.464346 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.464368 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.567140 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.567221 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.567241 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.567275 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.567298 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.671149 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.671561 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.671898 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.672175 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.672412 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.752184 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:57 crc kubenswrapper[4780]: E0929 18:44:57.752731 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.776781 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.777154 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.777395 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.777587 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.777750 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.881032 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.881143 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.881165 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.881194 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.881213 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.984308 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.984742 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.984949 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.985257 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:57 crc kubenswrapper[4780]: I0929 18:44:57.985483 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:57Z","lastTransitionTime":"2025-09-29T18:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.089160 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.089219 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.089239 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.089264 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.089283 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.193211 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.193272 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.193288 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.193317 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.193335 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.297167 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.297232 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.297253 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.297285 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.297307 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.401002 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.401117 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.401142 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.401170 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.401187 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.503803 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.503874 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.503892 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.503921 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.503939 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.608331 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.608394 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.608417 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.608447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.608473 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.711347 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.711415 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.711432 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.711460 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.711478 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.752872 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.752879 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.753372 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:44:58 crc kubenswrapper[4780]: E0929 18:44:58.753519 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:44:58 crc kubenswrapper[4780]: E0929 18:44:58.753775 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:44:58 crc kubenswrapper[4780]: E0929 18:44:58.753943 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.814370 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.814429 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.814447 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.814472 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.814492 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.917895 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.917942 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.917959 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.917986 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.918005 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.985230 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.985307 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.985329 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.985358 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 29 18:44:58 crc kubenswrapper[4780]: I0929 18:44:58.985378 4780 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-29T18:44:58Z","lastTransitionTime":"2025-09-29T18:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.051764 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw"] Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.052813 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.056875 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.056948 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.058788 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.058970 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.067713 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=27.06768287 podStartE2EDuration="27.06768287s" podCreationTimestamp="2025-09-29 18:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.067586838 +0000 UTC m=+99.015884902" watchObservedRunningTime="2025-09-29 18:44:59.06768287 +0000 UTC m=+99.015980914" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.096409 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.096373902 podStartE2EDuration="1m17.096373902s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.095759145 +0000 UTC m=+99.044057229" watchObservedRunningTime="2025-09-29 18:44:59.096373902 +0000 UTC m=+99.044671976" Sep 29 18:44:59 crc kubenswrapper[4780]: 
I0929 18:44:59.108860 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.10882898 podStartE2EDuration="45.10882898s" podCreationTimestamp="2025-09-29 18:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.108557322 +0000 UTC m=+99.056855396" watchObservedRunningTime="2025-09-29 18:44:59.10882898 +0000 UTC m=+99.057127054" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.196512 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gk8l9" podStartSLOduration=78.196493989 podStartE2EDuration="1m18.196493989s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.179104953 +0000 UTC m=+99.127403037" watchObservedRunningTime="2025-09-29 18:44:59.196493989 +0000 UTC m=+99.144792033" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.212172 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.212141826 podStartE2EDuration="1m19.212141826s" podCreationTimestamp="2025-09-29 18:43:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.211347854 +0000 UTC m=+99.159645908" watchObservedRunningTime="2025-09-29 18:44:59.212141826 +0000 UTC m=+99.160439910" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.229802 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.229876 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.229918 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.229963 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 
18:44:59.230018 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.241954 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f8mfd" podStartSLOduration=78.241934048 podStartE2EDuration="1m18.241934048s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.24129446 +0000 UTC m=+99.189592514" watchObservedRunningTime="2025-09-29 18:44:59.241934048 +0000 UTC m=+99.190232092" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.282571 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wc8rf" podStartSLOduration=78.282549863 podStartE2EDuration="1m18.282549863s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.282392868 +0000 UTC m=+99.230690912" watchObservedRunningTime="2025-09-29 18:44:59.282549863 +0000 UTC m=+99.230847907" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.297899 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podStartSLOduration=78.297875431 podStartE2EDuration="1m18.297875431s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.29784356 +0000 UTC m=+99.246141614" watchObservedRunningTime="2025-09-29 18:44:59.297875431 +0000 UTC m=+99.246173495" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.330940 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.330999 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.331023 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.331059 4780 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.331161 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.331959 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.332144 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.332294 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.341358 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.343394 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.343356991 podStartE2EDuration="1m13.343356991s" podCreationTimestamp="2025-09-29 18:43:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.342333113 +0000 UTC m=+99.290631157" watchObservedRunningTime="2025-09-29 18:44:59.343356991 +0000 UTC m=+99.291655035" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.357240 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pgf7g" podStartSLOduration=78.357219298 podStartE2EDuration="1m18.357219298s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.3565746 +0000 UTC m=+99.304872644" watchObservedRunningTime="2025-09-29 18:44:59.357219298 
+0000 UTC m=+99.305517342" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.361538 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca385e1-71dc-4841-af95-e8f6ab1daa1f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gxxcw\" (UID: \"2ca385e1-71dc-4841-af95-e8f6ab1daa1f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.368624 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.371352 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5smhz" podStartSLOduration=77.371332133 podStartE2EDuration="1m17.371332133s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:44:59.369910763 +0000 UTC m=+99.318208807" watchObservedRunningTime="2025-09-29 18:44:59.371332133 +0000 UTC m=+99.319630177" Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.434187 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" event={"ID":"2ca385e1-71dc-4841-af95-e8f6ab1daa1f","Type":"ContainerStarted","Data":"82e36acaabf4018997d6c3cc65d412ab405b470e5444ba639ba878b69dddb264"} Sep 29 18:44:59 crc kubenswrapper[4780]: I0929 18:44:59.752667 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:44:59 crc kubenswrapper[4780]: E0929 18:44:59.752785 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.439790 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" event={"ID":"2ca385e1-71dc-4841-af95-e8f6ab1daa1f","Type":"ContainerStarted","Data":"6800a4f307f9011ff1dedbcd00d16b5e1f64e8fe5f479bde609b2c1695669a47"} Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.455386 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gxxcw" podStartSLOduration=79.455347314 podStartE2EDuration="1m19.455347314s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:00.455030845 +0000 UTC m=+100.403328889" watchObservedRunningTime="2025-09-29 18:45:00.455347314 +0000 UTC m=+100.403645368" Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.752596 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.752596 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.752701 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:00 crc kubenswrapper[4780]: E0929 18:45:00.753700 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:00 crc kubenswrapper[4780]: E0929 18:45:00.753804 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:00 crc kubenswrapper[4780]: E0929 18:45:00.753904 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:00 crc kubenswrapper[4780]: I0929 18:45:00.952437 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:00 crc kubenswrapper[4780]: E0929 18:45:00.952769 4780 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:45:00 crc kubenswrapper[4780]: E0929 18:45:00.952939 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs podName:f7b75391-2034-4284-b779-eb7b1e9da774 nodeName:}" failed. No retries permitted until 2025-09-29 18:46:04.952907993 +0000 UTC m=+164.901206067 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs") pod "network-metrics-daemon-j6vxr" (UID: "f7b75391-2034-4284-b779-eb7b1e9da774") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 29 18:45:01 crc kubenswrapper[4780]: I0929 18:45:01.752729 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:01 crc kubenswrapper[4780]: E0929 18:45:01.752944 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:02 crc kubenswrapper[4780]: I0929 18:45:02.753079 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:02 crc kubenswrapper[4780]: I0929 18:45:02.753094 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:02 crc kubenswrapper[4780]: I0929 18:45:02.753284 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:02 crc kubenswrapper[4780]: E0929 18:45:02.753521 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:02 crc kubenswrapper[4780]: E0929 18:45:02.753612 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:02 crc kubenswrapper[4780]: E0929 18:45:02.753733 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:03 crc kubenswrapper[4780]: I0929 18:45:03.752290 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:03 crc kubenswrapper[4780]: E0929 18:45:03.752450 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:04 crc kubenswrapper[4780]: I0929 18:45:04.752945 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:04 crc kubenswrapper[4780]: I0929 18:45:04.753029 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:04 crc kubenswrapper[4780]: E0929 18:45:04.753266 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:04 crc kubenswrapper[4780]: E0929 18:45:04.753435 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:04 crc kubenswrapper[4780]: I0929 18:45:04.753870 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:04 crc kubenswrapper[4780]: E0929 18:45:04.754001 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:05 crc kubenswrapper[4780]: I0929 18:45:05.752532 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:05 crc kubenswrapper[4780]: E0929 18:45:05.752789 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:06 crc kubenswrapper[4780]: I0929 18:45:06.752388 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:06 crc kubenswrapper[4780]: E0929 18:45:06.752654 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:06 crc kubenswrapper[4780]: I0929 18:45:06.752944 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:06 crc kubenswrapper[4780]: I0929 18:45:06.753528 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:06 crc kubenswrapper[4780]: E0929 18:45:06.753783 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:06 crc kubenswrapper[4780]: I0929 18:45:06.753864 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:45:06 crc kubenswrapper[4780]: E0929 18:45:06.754075 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:06 crc kubenswrapper[4780]: E0929 18:45:06.754173 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:45:07 crc kubenswrapper[4780]: I0929 18:45:07.752711 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:07 crc kubenswrapper[4780]: E0929 18:45:07.753205 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:08 crc kubenswrapper[4780]: I0929 18:45:08.752421 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:08 crc kubenswrapper[4780]: I0929 18:45:08.752436 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:08 crc kubenswrapper[4780]: E0929 18:45:08.752642 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:08 crc kubenswrapper[4780]: I0929 18:45:08.752706 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:08 crc kubenswrapper[4780]: E0929 18:45:08.752786 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:08 crc kubenswrapper[4780]: E0929 18:45:08.753015 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:09 crc kubenswrapper[4780]: I0929 18:45:09.752276 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:09 crc kubenswrapper[4780]: E0929 18:45:09.752496 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:10 crc kubenswrapper[4780]: I0929 18:45:10.752426 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:10 crc kubenswrapper[4780]: I0929 18:45:10.752597 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:10 crc kubenswrapper[4780]: E0929 18:45:10.753782 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:10 crc kubenswrapper[4780]: I0929 18:45:10.753819 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:10 crc kubenswrapper[4780]: E0929 18:45:10.754002 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:10 crc kubenswrapper[4780]: E0929 18:45:10.754130 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:11 crc kubenswrapper[4780]: I0929 18:45:11.752913 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:11 crc kubenswrapper[4780]: E0929 18:45:11.753094 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:12 crc kubenswrapper[4780]: I0929 18:45:12.752685 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:12 crc kubenswrapper[4780]: I0929 18:45:12.752802 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:12 crc kubenswrapper[4780]: E0929 18:45:12.752886 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:12 crc kubenswrapper[4780]: I0929 18:45:12.752949 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:12 crc kubenswrapper[4780]: E0929 18:45:12.753236 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:12 crc kubenswrapper[4780]: E0929 18:45:12.753372 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:13 crc kubenswrapper[4780]: I0929 18:45:13.752097 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:13 crc kubenswrapper[4780]: E0929 18:45:13.752309 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:14 crc kubenswrapper[4780]: I0929 18:45:14.752232 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:14 crc kubenswrapper[4780]: I0929 18:45:14.752481 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:14 crc kubenswrapper[4780]: E0929 18:45:14.752573 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:14 crc kubenswrapper[4780]: E0929 18:45:14.752751 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:14 crc kubenswrapper[4780]: I0929 18:45:14.752824 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:14 crc kubenswrapper[4780]: E0929 18:45:14.753118 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:15 crc kubenswrapper[4780]: I0929 18:45:15.752513 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:15 crc kubenswrapper[4780]: E0929 18:45:15.752775 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.507536 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/1.log" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.508238 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/0.log" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.508306 4780 generic.go:334] "Generic (PLEG): container finished" podID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" containerID="bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d" exitCode=1 Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.508357 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerDied","Data":"bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d"} Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.508405 4780 scope.go:117] "RemoveContainer" containerID="59fc529e7b8f56c3150f07289d7bcbf9962b2b8867c49ea056d7027a3ecb41b7" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.509109 4780 scope.go:117] "RemoveContainer" containerID="bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d" Sep 29 18:45:16 crc kubenswrapper[4780]: E0929 18:45:16.509349 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wc8rf_openshift-multus(2c2af9fc-5cef-48e3-8070-cf2767bc4a81)\"" pod="openshift-multus/multus-wc8rf" podUID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.752255 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:16 crc kubenswrapper[4780]: E0929 18:45:16.752451 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.752541 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:16 crc kubenswrapper[4780]: I0929 18:45:16.752594 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:16 crc kubenswrapper[4780]: E0929 18:45:16.752771 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:16 crc kubenswrapper[4780]: E0929 18:45:16.752879 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:17 crc kubenswrapper[4780]: I0929 18:45:17.513680 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/1.log" Sep 29 18:45:17 crc kubenswrapper[4780]: I0929 18:45:17.753174 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:17 crc kubenswrapper[4780]: E0929 18:45:17.753386 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:17 crc kubenswrapper[4780]: I0929 18:45:17.754596 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:45:17 crc kubenswrapper[4780]: E0929 18:45:17.754887 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p7vtr_openshift-ovn-kubernetes(43a328df-2763-44f9-9512-3abb64ef45aa)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" Sep 29 18:45:18 crc kubenswrapper[4780]: I0929 18:45:18.752562 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:18 crc kubenswrapper[4780]: I0929 18:45:18.752567 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:18 crc kubenswrapper[4780]: E0929 18:45:18.752723 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:18 crc kubenswrapper[4780]: I0929 18:45:18.752997 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:18 crc kubenswrapper[4780]: E0929 18:45:18.753315 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:18 crc kubenswrapper[4780]: E0929 18:45:18.753406 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:19 crc kubenswrapper[4780]: I0929 18:45:19.753133 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:19 crc kubenswrapper[4780]: E0929 18:45:19.753440 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:20 crc kubenswrapper[4780]: I0929 18:45:20.752357 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:20 crc kubenswrapper[4780]: I0929 18:45:20.754135 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:20 crc kubenswrapper[4780]: I0929 18:45:20.754193 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:20 crc kubenswrapper[4780]: E0929 18:45:20.754578 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:20 crc kubenswrapper[4780]: E0929 18:45:20.754745 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:20 crc kubenswrapper[4780]: E0929 18:45:20.754892 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:20 crc kubenswrapper[4780]: E0929 18:45:20.755156 4780 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 29 18:45:20 crc kubenswrapper[4780]: E0929 18:45:20.878827 4780 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:45:21 crc kubenswrapper[4780]: I0929 18:45:21.753143 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:21 crc kubenswrapper[4780]: E0929 18:45:21.753370 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:22 crc kubenswrapper[4780]: I0929 18:45:22.752705 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:22 crc kubenswrapper[4780]: I0929 18:45:22.752723 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:22 crc kubenswrapper[4780]: E0929 18:45:22.752983 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:22 crc kubenswrapper[4780]: I0929 18:45:22.752723 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:22 crc kubenswrapper[4780]: E0929 18:45:22.753150 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:22 crc kubenswrapper[4780]: E0929 18:45:22.753222 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:23 crc kubenswrapper[4780]: I0929 18:45:23.752870 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:23 crc kubenswrapper[4780]: E0929 18:45:23.753198 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:24 crc kubenswrapper[4780]: I0929 18:45:24.752791 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:24 crc kubenswrapper[4780]: I0929 18:45:24.752835 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:24 crc kubenswrapper[4780]: I0929 18:45:24.752900 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:24 crc kubenswrapper[4780]: E0929 18:45:24.752964 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:24 crc kubenswrapper[4780]: E0929 18:45:24.753189 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:24 crc kubenswrapper[4780]: E0929 18:45:24.753231 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:25 crc kubenswrapper[4780]: I0929 18:45:25.752677 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:25 crc kubenswrapper[4780]: E0929 18:45:25.753642 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:25 crc kubenswrapper[4780]: E0929 18:45:25.880825 4780 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:45:26 crc kubenswrapper[4780]: I0929 18:45:26.753082 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:26 crc kubenswrapper[4780]: I0929 18:45:26.753162 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:26 crc kubenswrapper[4780]: I0929 18:45:26.753247 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:26 crc kubenswrapper[4780]: E0929 18:45:26.753346 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:26 crc kubenswrapper[4780]: E0929 18:45:26.753693 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:26 crc kubenswrapper[4780]: E0929 18:45:26.753818 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:27 crc kubenswrapper[4780]: I0929 18:45:27.752441 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:27 crc kubenswrapper[4780]: E0929 18:45:27.752618 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:28 crc kubenswrapper[4780]: I0929 18:45:28.752670 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:28 crc kubenswrapper[4780]: I0929 18:45:28.752908 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:28 crc kubenswrapper[4780]: I0929 18:45:28.753157 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:28 crc kubenswrapper[4780]: E0929 18:45:28.753039 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:28 crc kubenswrapper[4780]: E0929 18:45:28.753401 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:28 crc kubenswrapper[4780]: E0929 18:45:28.753525 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:28 crc kubenswrapper[4780]: I0929 18:45:28.755911 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.567011 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/3.log" Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.572587 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerStarted","Data":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.573124 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.602145 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podStartSLOduration=108.602124773 podStartE2EDuration="1m48.602124773s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:29.602078152 +0000 UTC m=+129.550376216" watchObservedRunningTime="2025-09-29 18:45:29.602124773 +0000 UTC m=+129.550422817" Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.752450 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:29 crc kubenswrapper[4780]: E0929 18:45:29.752622 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.855961 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6vxr"] Sep 29 18:45:29 crc kubenswrapper[4780]: I0929 18:45:29.856129 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:29 crc kubenswrapper[4780]: E0929 18:45:29.856267 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:30 crc kubenswrapper[4780]: I0929 18:45:30.753105 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:30 crc kubenswrapper[4780]: I0929 18:45:30.753260 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:30 crc kubenswrapper[4780]: E0929 18:45:30.754505 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:30 crc kubenswrapper[4780]: E0929 18:45:30.754668 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:30 crc kubenswrapper[4780]: E0929 18:45:30.881594 4780 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:45:31 crc kubenswrapper[4780]: I0929 18:45:31.752889 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:31 crc kubenswrapper[4780]: I0929 18:45:31.752959 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:31 crc kubenswrapper[4780]: E0929 18:45:31.753150 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:31 crc kubenswrapper[4780]: E0929 18:45:31.753409 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:31 crc kubenswrapper[4780]: I0929 18:45:31.753632 4780 scope.go:117] "RemoveContainer" containerID="bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d" Sep 29 18:45:32 crc kubenswrapper[4780]: I0929 18:45:32.587103 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/1.log" Sep 29 18:45:32 crc kubenswrapper[4780]: I0929 18:45:32.587811 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerStarted","Data":"9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd"} Sep 29 18:45:32 crc kubenswrapper[4780]: I0929 18:45:32.752920 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:32 crc kubenswrapper[4780]: I0929 18:45:32.753030 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:32 crc kubenswrapper[4780]: E0929 18:45:32.753206 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:32 crc kubenswrapper[4780]: E0929 18:45:32.753328 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:33 crc kubenswrapper[4780]: I0929 18:45:33.752631 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:33 crc kubenswrapper[4780]: I0929 18:45:33.752641 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:33 crc kubenswrapper[4780]: E0929 18:45:33.753009 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:33 crc kubenswrapper[4780]: E0929 18:45:33.753333 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:34 crc kubenswrapper[4780]: I0929 18:45:34.752309 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:34 crc kubenswrapper[4780]: I0929 18:45:34.752328 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:34 crc kubenswrapper[4780]: E0929 18:45:34.752562 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 29 18:45:34 crc kubenswrapper[4780]: E0929 18:45:34.752616 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 29 18:45:35 crc kubenswrapper[4780]: I0929 18:45:35.752869 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:35 crc kubenswrapper[4780]: E0929 18:45:35.753127 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 29 18:45:35 crc kubenswrapper[4780]: I0929 18:45:35.754200 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:35 crc kubenswrapper[4780]: E0929 18:45:35.754573 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6vxr" podUID="f7b75391-2034-4284-b779-eb7b1e9da774" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.752228 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.752254 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.755767 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.756091 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.756111 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 29 18:45:36 crc kubenswrapper[4780]: I0929 18:45:36.756382 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 29 18:45:37 crc kubenswrapper[4780]: I0929 18:45:37.752196 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:45:37 crc kubenswrapper[4780]: I0929 18:45:37.752218 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 29 18:45:37 crc kubenswrapper[4780]: I0929 18:45:37.755704 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 29 18:45:37 crc kubenswrapper[4780]: I0929 18:45:37.757033 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.044184 4780 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.090557 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d4z2z"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.091585 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.097750 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.098146 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.098481 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.098719 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.098894 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.099611 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.100264 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.100304 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.104960 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.105401 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.105712 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.105939 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.106739 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.110645 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.111701 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.116150 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2ttt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.116976 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7v67w"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.117617 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.117821 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.118022 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.118480 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.118559 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.118856 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.119487 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.126331 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.126391 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.127132 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.128060 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.128359 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.128966 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.129837 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.132296 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.132350 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.154678 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.154924 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.155721 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 
29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.156192 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.156256 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.156533 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.156639 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.157195 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.158336 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.158364 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.158619 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.158949 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.159313 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.160096 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.160347 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.160505 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.160651 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.163489 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.163778 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.163998 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.164165 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gj4p8"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.166540 4780 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.168916 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.169681 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.170571 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.175333 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.177262 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.177729 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.177965 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.178222 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vr4gw"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.179021 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.181206 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.181260 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.182125 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.182114 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.182603 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.182458 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.182891 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.183128 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.183340 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.186169 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.186950 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.187316 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.187871 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.188032 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.188325 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.188500 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.188553 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.189141 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.189273 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.189422 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.189566 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.189689 4780 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190192 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190359 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190505 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190592 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190701 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190785 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190862 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190934 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.190999 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.191089 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.191171 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.192091 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.193379 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.193465 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.197644 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.197909 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.198466 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.199384 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prx87"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.200153 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.201378 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.201621 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.201670 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.202003 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.202816 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.218760 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.218799 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.219286 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.219373 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.220694 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.224384 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.224873 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.225027 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.225186 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.225373 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.226146 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.259432 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.259890 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.262607 4780 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.262680 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.264270 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.264517 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265290 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-config\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265323 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kthzc\" (UniqueName: \"kubernetes.io/projected/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-kube-api-access-kthzc\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265342 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265359 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4px89\" (UniqueName: \"kubernetes.io/projected/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-kube-api-access-4px89\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265373 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265389 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265404 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265421 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-node-pullsecrets\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265436 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265454 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265469 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265485 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689rt\" (UniqueName: \"kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265498 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265514 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265530 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265547 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-client\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265562 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265589 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp57l\" (UniqueName: \"kubernetes.io/projected/0c169409-7ddd-4961-b837-847550878691-kube-api-access-cp57l\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265605 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265621 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13048f18-9e41-4649-a688-311a34f74222-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265637 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-config\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265652 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-encryption-config\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265666 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265689 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265712 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265733 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit-dir\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265737 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265752 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-client\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265939 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5l4\" (UniqueName: \"kubernetes.io/projected/d5a6b98f-17b6-4e3c-aa64-9b05b9d23547-kube-api-access-2h5l4\") pod \"downloads-7954f5f757-7v67w\" (UID: \"d5a6b98f-17b6-4e3c-aa64-9b05b9d23547\") " pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265956 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265970 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9wd6\" (UniqueName: \"kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265984 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hscc\" (UniqueName: \"kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.265999 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266013 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c169409-7ddd-4961-b837-847550878691-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266030 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266060 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266075 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266090 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-image-import-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-serving-cert\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266126 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266148 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266168 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-serving-cert\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266182 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266204 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-serving-cert\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266217 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266231 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-images\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266246 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266260 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266276 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266289 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-encryption-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266305 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-auth-proxy-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266319 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-machine-approver-tls\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266334 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13048f18-9e41-4649-a688-311a34f74222-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266349 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4qs9\" (UniqueName: \"kubernetes.io/projected/13048f18-9e41-4649-a688-311a34f74222-kube-api-access-g4qs9\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266365 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwzj\" (UniqueName: \"kubernetes.io/projected/7b540596-75a7-4dd2-9466-758942da4d0d-kube-api-access-hhwzj\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266381 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-audit-policies\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266395 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b540596-75a7-4dd2-9466-758942da4d0d-audit-dir\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266410 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96d9m\" (UniqueName: \"kubernetes.io/projected/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-kube-api-access-96d9m\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266430 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266451 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.266666 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.267323 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.275076 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.275619 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.281092 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.282771 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.286151 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.286282 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.287070 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.287794 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.288340 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.288966 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.289197 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.289554 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.289784 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.289985 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.290428 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.296189 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.302900 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.303560 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8n855"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.304034 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gmjd6"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.304268 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.304852 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rqptt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.305284 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.305545 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.305677 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.309911 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.310869 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.313136 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhjbl"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.313942 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.314302 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.314738 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.315069 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.320112 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.320638 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.328176 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.328284 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.328916 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.328976 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.329618 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.340397 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.342015 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.342697 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vgx7v"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.343770 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.344422 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.344446 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.347821 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.351771 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.359618 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.360080 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367831 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367870 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-serving-cert\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367940 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-serving-cert\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367959 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367975 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-images\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.367994 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368038 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-encryption-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368075 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368094 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-auth-proxy-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368118 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-machine-approver-tls\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368136 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13048f18-9e41-4649-a688-311a34f74222-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368154 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qs9\" (UniqueName: \"kubernetes.io/projected/13048f18-9e41-4649-a688-311a34f74222-kube-api-access-g4qs9\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368177 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c0b12a3c-36ae-410a-9b32-71777ada78a8-metrics-tls\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368199 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-audit-policies\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368217 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b540596-75a7-4dd2-9466-758942da4d0d-audit-dir\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368242 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwzj\" (UniqueName: \"kubernetes.io/projected/7b540596-75a7-4dd2-9466-758942da4d0d-kube-api-access-hhwzj\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368264 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96d9m\" (UniqueName: \"kubernetes.io/projected/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-kube-api-access-96d9m\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368288 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368307 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368328 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-config\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368347 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kthzc\" (UniqueName: \"kubernetes.io/projected/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-kube-api-access-kthzc\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc 
kubenswrapper[4780]: I0929 18:45:40.368372 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368392 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4px89\" (UniqueName: \"kubernetes.io/projected/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-kube-api-access-4px89\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368413 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368436 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368458 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368477 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-node-pullsecrets\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368506 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368526 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368545 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368563 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689rt\" (UniqueName: \"kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368586 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368603 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368619 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368651 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368668 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-client\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368701 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp57l\" (UniqueName: \"kubernetes.io/projected/0c169409-7ddd-4961-b837-847550878691-kube-api-access-cp57l\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368720 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit\") pod \"apiserver-76f77b778f-gj4p8\" (UID: 
\"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368738 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13048f18-9e41-4649-a688-311a34f74222-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368756 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-config\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368776 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368779 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368794 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368813 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368828 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-encryption-config\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368846 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit-dir\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368863 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368882 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wd6\" (UniqueName: \"kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368905 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-client\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368925 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5l4\" (UniqueName: \"kubernetes.io/projected/d5a6b98f-17b6-4e3c-aa64-9b05b9d23547-kube-api-access-2h5l4\") pod \"downloads-7954f5f757-7v67w\" (UID: \"d5a6b98f-17b6-4e3c-aa64-9b05b9d23547\") " pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368947 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c169409-7ddd-4961-b837-847550878691-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368965 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hscc\" (UniqueName: \"kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.368985 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.369003 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.369023 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.369040 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.369083 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmdc\" (UniqueName: \"kubernetes.io/projected/c0b12a3c-36ae-410a-9b32-71777ada78a8-kube-api-access-tcmdc\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.369648 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13048f18-9e41-4649-a688-311a34f74222-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.370026 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-image-import-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.370101 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.370128 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-serving-cert\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.370150 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.370315 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b540596-75a7-4dd2-9466-758942da4d0d-audit-dir\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.371143 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.371180 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-audit-policies\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.371143 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.372028 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.372209 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-auth-proxy-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.372798 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.373130 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.373299 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit-dir\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.374600 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.374804 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-config\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.375663 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.376214 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-serving-cert\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.376913 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.377675 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-encryption-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.377992 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.378087 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.378393 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.378632 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-encryption-config\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.379134 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.379511 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.379699 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d4z2z"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.379709 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-image-import-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.380016 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.380292 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-config\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.380660 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-config\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.381225 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:40 crc 
kubenswrapper[4780]: I0929 18:45:40.381316 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.382545 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-machine-approver-tls\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.382696 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.382821 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.384223 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0c169409-7ddd-4961-b837-847550878691-images\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.384540 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.384586 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.384873 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0c169409-7ddd-4961-b837-847550878691-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.384894 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.385008 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b540596-75a7-4dd2-9466-758942da4d0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.385231 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-node-pullsecrets\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.385774 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-config\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.385841 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.386263 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-audit\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.386099 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.386779 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.386991 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.387461 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.387672 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9mx5m"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.387922 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.388404 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.388712 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.389467 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-client\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.389764 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-etcd-serving-ca\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.390281 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.390885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-serving-cert\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.391258 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.391602 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.391658 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13048f18-9e41-4649-a688-311a34f74222-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.392344 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.392390 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.393712 
4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.393830 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.394191 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.394493 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gj4p8"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.395729 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gmjd6"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.396036 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8n855"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.396207 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-serving-cert\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.397361 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2ttt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.398303 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prx87"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.401182 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.401261 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7v67w"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.401272 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.404771 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vr4gw"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.404817 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.407157 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.407186 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.410334 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.410365 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rqptt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.410377 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.413543 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhjbl"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.413576 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.413589 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.414948 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.415783 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.416809 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.418345 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.419384 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.422530 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.422563 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9m8qp"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.423218 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.426673 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-776sf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.427712 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.427733 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.427821 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.430797 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9m8qp"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.441790 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b540596-75a7-4dd2-9466-758942da4d0d-etcd-client\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.442392 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.442764 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.443061 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.453116 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-776sf"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.453181 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tcnj7"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.454976 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.456642 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.459396 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tcnj7"] Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.471189 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmdc\" (UniqueName: \"kubernetes.io/projected/c0b12a3c-36ae-410a-9b32-71777ada78a8-kube-api-access-tcmdc\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.471306 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0b12a3c-36ae-410a-9b32-71777ada78a8-metrics-tls\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.473927 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.474880 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0b12a3c-36ae-410a-9b32-71777ada78a8-metrics-tls\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.494029 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.514654 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.534629 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.564213 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.574772 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.595496 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.615076 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.654939 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.674652 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.693733 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.715880 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 
29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.743749 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.754099 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.774580 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.794576 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.815457 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.834098 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.855373 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.875868 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.894176 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.915163 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.935150 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.956325 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.974581 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 29 18:45:40 crc kubenswrapper[4780]: I0929 18:45:40.995605 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.014500 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.035015 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.056439 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.075018 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.095762 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 
29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.114954 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.135946 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.155528 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.175658 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.194460 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.215829 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.235296 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.255365 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.274199 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.296529 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.333352 4780 request.go:700] Waited for 1.011714911s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0 Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.336185 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.356343 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.374447 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.384581 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.384713 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-config\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.384839 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.384876 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd4d246-d458-4255-8edb-f043e8633560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386250 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386494 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f58a165-7d7b-435e-bea2-33d2e8498a1f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386734 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386791 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-trusted-ca\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386840 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbw85\" (UniqueName: \"kubernetes.io/projected/5d7b3f63-b077-4b71-82cb-9441ffda1b74-kube-api-access-nbw85\") pod \"migrator-59844c95c7-mnkdc\" (UID: \"5d7b3f63-b077-4b71-82cb-9441ffda1b74\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386909 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/abd4d246-d458-4255-8edb-f043e8633560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.386957 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ppdx\" (UniqueName: \"kubernetes.io/projected/25b697fa-b2da-4f7b-9f70-df23ef21caef-kube-api-access-7ppdx\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.387010 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjt4w\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.387254 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9j2s\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-kube-api-access-x9j2s\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.387474 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.387669 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d94c958-7aee-4529-9d1d-a961fe232f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.387823 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.389782 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:41.889756606 +0000 UTC m=+141.838054690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.389894 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f58a165-7d7b-435e-bea2-33d2e8498a1f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.390271 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f58a165-7d7b-435e-bea2-33d2e8498a1f-config\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.391394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d94c958-7aee-4529-9d1d-a961fe232f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.392440 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d0d686-bf59-426e-a1d4-99af3f38162f-serving-cert\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.394200 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b697fa-b2da-4f7b-9f70-df23ef21caef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.394348 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62ln\" (UniqueName: \"kubernetes.io/projected/8d94c958-7aee-4529-9d1d-a961fe232f9b-kube-api-access-b62ln\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.394883 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.394973 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czgdl\" (UniqueName: \"kubernetes.io/projected/e7d0d686-bf59-426e-a1d4-99af3f38162f-kube-api-access-czgdl\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.395177 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.401758 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.416721 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.435006 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.456453 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.473626 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.495169 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496558 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.496671 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:41.996646826 +0000 UTC m=+141.944944890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496816 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b62ln\" (UniqueName: \"kubernetes.io/projected/8d94c958-7aee-4529-9d1d-a961fe232f9b-kube-api-access-b62ln\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496858 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-profile-collector-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496882 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-client\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496914 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496933 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-metrics-certs\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496959 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24d0b10-049f-4082-84c2-06a42e8fa4d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.496978 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497001 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83670b30-2222-428b-b4cc-17d16e0bedb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497019 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wlx7\" (UniqueName: \"kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497075 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497094 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4npk\" (UniqueName: \"kubernetes.io/projected/ce5c3243-04a4-4b1d-8800-c10b0b83916d-kube-api-access-j4npk\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497115 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13c35f8-dc86-4234-b91d-7e9130419e19-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497134 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc595\" (UniqueName: \"kubernetes.io/projected/b68f3b8f-359e-4bfc-a968-f090b2960ee9-kube-api-access-pc595\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497161 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497188 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-profile-collector-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc 
kubenswrapper[4780]: I0929 18:45:41.497207 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497226 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69dz\" (UniqueName: \"kubernetes.io/projected/be97fd99-dd61-4c66-a928-095939f74649-kube-api-access-c69dz\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497245 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj4v\" (UniqueName: \"kubernetes.io/projected/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-kube-api-access-2pj4v\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497263 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-images\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497279 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b5635-a763-45f8-9529-b57c33c0bef3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497302 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497322 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d24s\" (UniqueName: \"kubernetes.io/projected/44fa4e91-9dc5-4eec-91bd-482541249e47-kube-api-access-2d24s\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497341 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxkhh\" (UniqueName: \"kubernetes.io/projected/83670b30-2222-428b-b4cc-17d16e0bedb2-kube-api-access-bxkhh\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:41 crc 
kubenswrapper[4780]: I0929 18:45:41.497366 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbw85\" (UniqueName: \"kubernetes.io/projected/5d7b3f63-b077-4b71-82cb-9441ffda1b74-kube-api-access-nbw85\") pod \"migrator-59844c95c7-mnkdc\" (UID: \"5d7b3f63-b077-4b71-82cb-9441ffda1b74\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497385 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw894\" (UniqueName: \"kubernetes.io/projected/23ba5721-4137-453c-96f7-54a7b3de5902-kube-api-access-kw894\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497406 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/140efc80-e021-46f1-8878-4f018e4f33b7-signing-cabundle\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497428 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjt4w\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497450 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/140efc80-e021-46f1-8878-4f018e4f33b7-signing-key\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497478 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-mountpoint-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497506 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9j2s\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-kube-api-access-x9j2s\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497526 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44fa4e91-9dc5-4eec-91bd-482541249e47-metrics-tls\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497551 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/c1ac77a9-1c0d-4613-a093-98c52157eb53-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497574 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d94c958-7aee-4529-9d1d-a961fe232f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497597 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497619 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ce5c3243-04a4-4b1d-8800-c10b0b83916d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497639 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-service-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497661 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f58a165-7d7b-435e-bea2-33d2e8498a1f-config\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497680 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-node-bootstrap-token\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497700 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-registration-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497721 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8d94c958-7aee-4529-9d1d-a961fe232f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497748 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxv98\" (UniqueName: \"kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497770 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmqh\" (UniqueName: \"kubernetes.io/projected/2db80d2a-2621-41ca-a35f-89640799690b-kube-api-access-khmqh\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497790 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-plugins-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497811 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-config\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497831 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzvj\" (UniqueName: \"kubernetes.io/projected/0052df5f-706f-4dc9-b03e-dbd98d090fb3-kube-api-access-zgzvj\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497849 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-certs\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497872 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b697fa-b2da-4f7b-9f70-df23ef21caef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497895 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d13c35f8-dc86-4234-b91d-7e9130419e19-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497914 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-stats-auth\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497933 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-srv-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497962 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czgdl\" (UniqueName: \"kubernetes.io/projected/e7d0d686-bf59-426e-a1d4-99af3f38162f-kube-api-access-czgdl\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.497982 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498024 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ac77a9-1c0d-4613-a093-98c52157eb53-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498086 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498114 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkjw\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-kube-api-access-hhkjw\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498135 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-webhook-cert\") pod 
\"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498156 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0a510a-24d6-46ba-8ff7-4682156a908d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498176 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498197 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498218 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-config\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498238 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpslx\" (UniqueName: \"kubernetes.io/projected/140efc80-e021-46f1-8878-4f018e4f33b7-kube-api-access-qpslx\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498256 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24d0b10-049f-4082-84c2-06a42e8fa4d9-proxy-tls\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498286 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498309 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59png\" (UniqueName: \"kubernetes.io/projected/a5df5c15-d545-48e6-8d2c-d46b39b6b705-kube-api-access-59png\") pod 
\"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498327 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-default-certificate\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498347 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498368 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd4d246-d458-4255-8edb-f043e8633560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498386 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-srv-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498406 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498445 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0761f136-d154-4872-86b1-04658a564728-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498468 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28ea348-5f33-4484-8039-4cba3bb89234-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498490 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f58a165-7d7b-435e-bea2-33d2e8498a1f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: 
\"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498510 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498531 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-socket-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498554 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdfq\" (UniqueName: \"kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498577 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea0b5635-a763-45f8-9529-b57c33c0bef3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498595 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-config\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498620 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23ba5721-4137-453c-96f7-54a7b3de5902-proxy-tls\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498652 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498672 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-trusted-ca\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 
18:45:41.498690 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2bd\" (UniqueName: \"kubernetes.io/projected/9c0a510a-24d6-46ba-8ff7-4682156a908d-kube-api-access-ms2bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498717 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd4d246-d458-4255-8edb-f043e8633560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498736 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ppdx\" (UniqueName: \"kubernetes.io/projected/25b697fa-b2da-4f7b-9f70-df23ef21caef-kube-api-access-7ppdx\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498761 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6tc\" (UniqueName: \"kubernetes.io/projected/0761f136-d154-4872-86b1-04658a564728-kube-api-access-9z6tc\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498781 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjwq\" (UniqueName: \"kubernetes.io/projected/a24d0b10-049f-4082-84c2-06a42e8fa4d9-kube-api-access-5vjwq\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498800 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498817 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-csi-data-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498837 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498856 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-serving-cert\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498877 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dphm9\" (UniqueName: \"kubernetes.io/projected/c1ac77a9-1c0d-4613-a093-98c52157eb53-kube-api-access-dphm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498895 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44fa4e91-9dc5-4eec-91bd-482541249e47-config-volume\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498913 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-serving-cert\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498931 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d28ea348-5f33-4484-8039-4cba3bb89234-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498956 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f58a165-7d7b-435e-bea2-33d2e8498a1f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.498995 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499012 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a049a2-59dd-4762-ac58-9eb88fc892a4-service-ca-bundle\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " 
pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499031 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0a510a-24d6-46ba-8ff7-4682156a908d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499068 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fw6\" (UniqueName: \"kubernetes.io/projected/26a049a2-59dd-4762-ac58-9eb88fc892a4-kube-api-access-p2fw6\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499087 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7rk\" (UniqueName: \"kubernetes.io/projected/42152318-f79e-44b4-b9ce-0a9b29eddfff-kube-api-access-2p7rk\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499112 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/be97fd99-dd61-4c66-a928-095939f74649-tmpfs\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499129 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28ea348-5f33-4484-8039-4cba3bb89234-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499148 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2db80d2a-2621-41ca-a35f-89640799690b-cert\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499183 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d0d686-bf59-426e-a1d4-99af3f38162f-serving-cert\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499201 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-apiservice-cert\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 
18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499221 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499242 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhrm\" (UniqueName: \"kubernetes.io/projected/7f434612-b9f0-48bd-90e9-a8c7c7385466-kube-api-access-9mhrm\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499261 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13c35f8-dc86-4234-b91d-7e9130419e19-config\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499282 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.499869 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.500615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-config\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.502804 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8d94c958-7aee-4529-9d1d-a961fe232f9b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.503281 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.503806 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7d0d686-bf59-426e-a1d4-99af3f38162f-trusted-ca\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.504081 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.00400823 +0000 UTC m=+141.952306314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.504112 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abd4d246-d458-4255-8edb-f043e8633560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.504582 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f58a165-7d7b-435e-bea2-33d2e8498a1f-config\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.504861 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.507622 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/abd4d246-d458-4255-8edb-f043e8633560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.507006 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f58a165-7d7b-435e-bea2-33d2e8498a1f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.510414 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.511904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25b697fa-b2da-4f7b-9f70-df23ef21caef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.513915 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.514292 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7d0d686-bf59-426e-a1d4-99af3f38162f-serving-cert\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.514581 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d94c958-7aee-4529-9d1d-a961fe232f9b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.515768 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.536539 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.555241 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.574814 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.595795 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.600727 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.602590 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.102466131 +0000 UTC m=+142.050764185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.602677 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxv98\" (UniqueName: \"kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.602740 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khmqh\" (UniqueName: \"kubernetes.io/projected/2db80d2a-2621-41ca-a35f-89640799690b-kube-api-access-khmqh\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.603015 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-plugins-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.603342 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-plugins-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.603429 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-config\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604279 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-config\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604306 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzvj\" (UniqueName: \"kubernetes.io/projected/0052df5f-706f-4dc9-b03e-dbd98d090fb3-kube-api-access-zgzvj\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604366 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-certs\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " 
pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604401 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d13c35f8-dc86-4234-b91d-7e9130419e19-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604431 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-stats-auth\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604457 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-srv-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604509 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604559 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkjw\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-kube-api-access-hhkjw\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604583 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ac77a9-1c0d-4613-a093-98c52157eb53-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604608 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-webhook-cert\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604638 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0a510a-24d6-46ba-8ff7-4682156a908d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604669 
4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604707 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpslx\" (UniqueName: \"kubernetes.io/projected/140efc80-e021-46f1-8878-4f018e4f33b7-kube-api-access-qpslx\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604761 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604788 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59png\" (UniqueName: \"kubernetes.io/projected/a5df5c15-d545-48e6-8d2c-d46b39b6b705-kube-api-access-59png\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604825 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24d0b10-049f-4082-84c2-06a42e8fa4d9-proxy-tls\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.604868 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-srv-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605165 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605189 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-default-certificate\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605250 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0761f136-d154-4872-86b1-04658a564728-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605288 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28ea348-5f33-4484-8039-4cba3bb89234-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605321 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605347 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-socket-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605379 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdfq\" (UniqueName: \"kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605409 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea0b5635-a763-45f8-9529-b57c33c0bef3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605434 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-config\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605471 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23ba5721-4137-453c-96f7-54a7b3de5902-proxy-tls\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605509 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2bd\" (UniqueName: \"kubernetes.io/projected/9c0a510a-24d6-46ba-8ff7-4682156a908d-kube-api-access-ms2bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605569 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6tc\" (UniqueName: \"kubernetes.io/projected/0761f136-d154-4872-86b1-04658a564728-kube-api-access-9z6tc\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605595 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjwq\" (UniqueName: \"kubernetes.io/projected/a24d0b10-049f-4082-84c2-06a42e8fa4d9-kube-api-access-5vjwq\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605618 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605667 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-csi-data-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605702 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-serving-cert\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605736 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dphm9\" (UniqueName: \"kubernetes.io/projected/c1ac77a9-1c0d-4613-a093-98c52157eb53-kube-api-access-dphm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605767 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44fa4e91-9dc5-4eec-91bd-482541249e47-config-volume\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605789 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-serving-cert\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc 
kubenswrapper[4780]: I0929 18:45:41.605811 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d28ea348-5f33-4484-8039-4cba3bb89234-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605845 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605867 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a049a2-59dd-4762-ac58-9eb88fc892a4-service-ca-bundle\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605929 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0a510a-24d6-46ba-8ff7-4682156a908d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fw6\" (UniqueName: \"kubernetes.io/projected/26a049a2-59dd-4762-ac58-9eb88fc892a4-kube-api-access-p2fw6\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605976 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7rk\" (UniqueName: \"kubernetes.io/projected/42152318-f79e-44b4-b9ce-0a9b29eddfff-kube-api-access-2p7rk\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.605976 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ac77a9-1c0d-4613-a093-98c52157eb53-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606007 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/be97fd99-dd61-4c66-a928-095939f74649-tmpfs\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606028 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28ea348-5f33-4484-8039-4cba3bb89234-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606076 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2db80d2a-2621-41ca-a35f-89640799690b-cert\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606123 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-apiservice-cert\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606145 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606172 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhrm\" (UniqueName: \"kubernetes.io/projected/7f434612-b9f0-48bd-90e9-a8c7c7385466-kube-api-access-9mhrm\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606196 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13c35f8-dc86-4234-b91d-7e9130419e19-config\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606225 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606258 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-profile-collector-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606279 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-client\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606324 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-metrics-certs\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606348 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606370 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24d0b10-049f-4082-84c2-06a42e8fa4d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606486 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606519 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28ea348-5f33-4484-8039-4cba3bb89234-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606616 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606676 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83670b30-2222-428b-b4cc-17d16e0bedb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606779 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4npk\" (UniqueName: \"kubernetes.io/projected/ce5c3243-04a4-4b1d-8800-c10b0b83916d-kube-api-access-j4npk\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606827 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13c35f8-dc86-4234-b91d-7e9130419e19-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606871 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wlx7\" (UniqueName: \"kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc595\" (UniqueName: \"kubernetes.io/projected/b68f3b8f-359e-4bfc-a968-f090b2960ee9-kube-api-access-pc595\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.606994 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607039 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-profile-collector-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607116 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607157 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69dz\" (UniqueName: \"kubernetes.io/projected/be97fd99-dd61-4c66-a928-095939f74649-kube-api-access-c69dz\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607205 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj4v\" (UniqueName: \"kubernetes.io/projected/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-kube-api-access-2pj4v\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607240 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-images\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607282 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b5635-a763-45f8-9529-b57c33c0bef3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607327 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607363 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d24s\" (UniqueName: \"kubernetes.io/projected/44fa4e91-9dc5-4eec-91bd-482541249e47-kube-api-access-2d24s\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607397 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxkhh\" (UniqueName: \"kubernetes.io/projected/83670b30-2222-428b-b4cc-17d16e0bedb2-kube-api-access-bxkhh\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607430 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a24d0b10-049f-4082-84c2-06a42e8fa4d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607456 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw894\" (UniqueName: \"kubernetes.io/projected/23ba5721-4137-453c-96f7-54a7b3de5902-kube-api-access-kw894\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607500 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/140efc80-e021-46f1-8878-4f018e4f33b7-signing-cabundle\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607549 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/140efc80-e021-46f1-8878-4f018e4f33b7-signing-key\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: 
\"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607603 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/be97fd99-dd61-4c66-a928-095939f74649-tmpfs\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607718 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-mountpoint-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607608 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-mountpoint-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607838 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44fa4e91-9dc5-4eec-91bd-482541249e47-metrics-tls\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607885 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ac77a9-1c0d-4613-a093-98c52157eb53-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607946 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607988 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ce5c3243-04a4-4b1d-8800-c10b0b83916d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.608026 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-service-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.608107 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-node-bootstrap-token\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.608152 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-registration-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607590 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.607759 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-socket-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.608670 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-config\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.608775 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.609252 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.109225835 +0000 UTC m=+142.057523919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.610391 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13c35f8-dc86-4234-b91d-7e9130419e19-config\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.610796 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-csi-data-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.611242 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-srv-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.611777 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-service-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.612122 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c0a510a-24d6-46ba-8ff7-4682156a908d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.612292 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0761f136-d154-4872-86b1-04658a564728-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.613090 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-ca\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.613261 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.613981 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/83670b30-2222-428b-b4cc-17d16e0bedb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.615272 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-webhook-cert\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.615341 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.615864 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.616016 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.616109 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/23ba5721-4137-453c-96f7-54a7b3de5902-images\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.616247 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0052df5f-706f-4dc9-b03e-dbd98d090fb3-registration-dir\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.616775 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23ba5721-4137-453c-96f7-54a7b3de5902-proxy-tls\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 
18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.617141 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28ea348-5f33-4484-8039-4cba3bb89234-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.617171 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.617991 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.618033 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be97fd99-dd61-4c66-a928-095939f74649-apiservice-cert\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.618933 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.619559 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/140efc80-e021-46f1-8878-4f018e4f33b7-signing-cabundle\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.619672 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13c35f8-dc86-4234-b91d-7e9130419e19-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.621290 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0a510a-24d6-46ba-8ff7-4682156a908d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.621494 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " 
pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.621509 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ce5c3243-04a4-4b1d-8800-c10b0b83916d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.621617 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-profile-collector-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.621773 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-serving-cert\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.622263 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/42152318-f79e-44b4-b9ce-0a9b29eddfff-profile-collector-cert\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.623667 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a5df5c15-d545-48e6-8d2c-d46b39b6b705-etcd-client\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.625081 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-serving-cert\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.625697 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ac77a9-1c0d-4613-a093-98c52157eb53-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.630726 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/140efc80-e021-46f1-8878-4f018e4f33b7-signing-key\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.634611 4780 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.654849 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.675548 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.687904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-metrics-certs\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.695596 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.699715 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-default-certificate\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.708668 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.709068 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.209016861 +0000 UTC m=+142.157314915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.709412 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.709962 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.209936011 +0000 UTC m=+142.158234125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.714726 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.718999 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26a049a2-59dd-4762-ac58-9eb88fc892a4-stats-auth\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.735228 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.743405 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26a049a2-59dd-4762-ac58-9eb88fc892a4-service-ca-bundle\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.755680 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.776561 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.794853 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.810458 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.810715 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.310669408 +0000 UTC m=+142.258967492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.810961 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.811465 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.311444784 +0000 UTC m=+142.259742838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.815532 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.835416 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.845501 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea0b5635-a763-45f8-9529-b57c33c0bef3-metrics-tls\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.868598 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.875238 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.877184 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b5635-a763-45f8-9529-b57c33c0bef3-trusted-ca\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.895148 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.903289 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f434612-b9f0-48bd-90e9-a8c7c7385466-srv-cert\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.913257 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.913496 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.413440812 +0000 UTC m=+142.361738866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.914513 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:41 crc kubenswrapper[4780]: E0929 18:45:41.915095 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.415083487 +0000 UTC m=+142.363381551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.941320 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96d9m\" (UniqueName: \"kubernetes.io/projected/7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a-kube-api-access-96d9m\") pod \"apiserver-76f77b778f-gj4p8\" (UID: \"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a\") " pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.965245 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qs9\" (UniqueName: \"kubernetes.io/projected/13048f18-9e41-4649-a688-311a34f74222-kube-api-access-g4qs9\") pod \"openshift-apiserver-operator-796bbdcf4f-8jj99\" (UID: \"13048f18-9e41-4649-a688-311a34f74222\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.983747 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwzj\" (UniqueName: \"kubernetes.io/projected/7b540596-75a7-4dd2-9466-758942da4d0d-kube-api-access-hhwzj\") pod \"apiserver-7bbb656c7d-qnzcj\" (UID: \"7b540596-75a7-4dd2-9466-758942da4d0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:41 crc kubenswrapper[4780]: I0929 18:45:41.985327 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.001874 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hscc\" (UniqueName: \"kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc\") pod \"oauth-openshift-558db77b4-n6qtf\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.015931 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.016195 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.516151114 +0000 UTC m=+142.464449168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.016396 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.016895 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.516874288 +0000 UTC m=+142.465172352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.024694 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kthzc\" (UniqueName: \"kubernetes.io/projected/22e66bf0-740b-46c9-aa4c-3a26bfc49ba7-kube-api-access-kthzc\") pod \"authentication-operator-69f744f599-q2ttt\" (UID: \"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.036831 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4px89\" (UniqueName: \"kubernetes.io/projected/268fec05-58bc-4843-b4c5-4bcc6d9cb8d0-kube-api-access-4px89\") pod \"machine-approver-56656f9798-8zct4\" (UID: \"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.055280 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.063870 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wd6\" (UniqueName: \"kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6\") pod \"controller-manager-879f6c89f-vr2qc\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.063879 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a24d0b10-049f-4082-84c2-06a42e8fa4d9-proxy-tls\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.075205 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.078291 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.104148 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.119248 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.119960 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.619939282 +0000 UTC m=+142.568237336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.122020 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.125345 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-689rt\" (UniqueName: \"kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt\") pod \"route-controller-manager-6576b87f9c-w8l7s\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.152230 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5l4\" (UniqueName: \"kubernetes.io/projected/d5a6b98f-17b6-4e3c-aa64-9b05b9d23547-kube-api-access-2h5l4\") pod \"downloads-7954f5f757-7v67w\" (UID: \"d5a6b98f-17b6-4e3c-aa64-9b05b9d23547\") " pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.155735 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.163930 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp57l\" (UniqueName: \"kubernetes.io/projected/0c169409-7ddd-4961-b837-847550878691-kube-api-access-cp57l\") pod \"machine-api-operator-5694c8668f-d4z2z\" (UID: \"0c169409-7ddd-4961-b837-847550878691\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.165661 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-certs\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.176486 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.195522 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.208089 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b68f3b8f-359e-4bfc-a968-f090b2960ee9-node-bootstrap-token\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.212083 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.214872 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.221137 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.221559 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.721545578 +0000 UTC m=+142.669843622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.235513 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.249655 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj"] Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.255858 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.258066 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.269778 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.276024 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.294665 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.303270 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.314390 4780 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.322481 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.322821 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.822767071 +0000 UTC m=+142.771065165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.323536 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.324062 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.824037833 +0000 UTC m=+142.772335877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.329901 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.335159 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.353024 4780 request.go:700] Waited for 1.896080164s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.355243 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.374918 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.386754 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.395517 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.401468 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44fa4e91-9dc5-4eec-91bd-482541249e47-config-volume\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.424306 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.424510 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.924470349 +0000 UTC m=+142.872768403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.425267 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.425915 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:42.925902927 +0000 UTC m=+142.874200981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.429581 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44fa4e91-9dc5-4eec-91bd-482541249e47-metrics-tls\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.431962 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2db80d2a-2621-41ca-a35f-89640799690b-cert\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.437612 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmdc\" (UniqueName: \"kubernetes.io/projected/c0b12a3c-36ae-410a-9b32-71777ada78a8-kube-api-access-tcmdc\") pod \"dns-operator-744455d44c-vr4gw\" (UID: \"c0b12a3c-36ae-410a-9b32-71777ada78a8\") " pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.451819 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.486132 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czgdl\" (UniqueName: \"kubernetes.io/projected/e7d0d686-bf59-426e-a1d4-99af3f38162f-kube-api-access-czgdl\") pod \"console-operator-58897d9998-prx87\" (UID: \"e7d0d686-bf59-426e-a1d4-99af3f38162f\") " pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.490890 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.515295 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62ln\" (UniqueName: \"kubernetes.io/projected/8d94c958-7aee-4529-9d1d-a961fe232f9b-kube-api-access-b62ln\") pod \"openshift-config-operator-7777fb866f-dnm7s\" (UID: \"8d94c958-7aee-4529-9d1d-a961fe232f9b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.519419 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.528670 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.540568 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjt4w\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.545153 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-prx87" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.545354 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.045329363 +0000 UTC m=+142.993627407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.577127 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.583773 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f58a165-7d7b-435e-bea2-33d2e8498a1f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fxr6n\" (UID: \"7f58a165-7d7b-435e-bea2-33d2e8498a1f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.588346 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9j2s\" (UniqueName: \"kubernetes.io/projected/abd4d246-d458-4255-8edb-f043e8633560-kube-api-access-x9j2s\") pod \"cluster-image-registry-operator-dc59b4c8b-62klh\" (UID: \"abd4d246-d458-4255-8edb-f043e8633560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.608325 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.613803 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbw85\" (UniqueName: \"kubernetes.io/projected/5d7b3f63-b077-4b71-82cb-9441ffda1b74-kube-api-access-nbw85\") pod \"migrator-59844c95c7-mnkdc\" (UID: \"5d7b3f63-b077-4b71-82cb-9441ffda1b74\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.630497 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" event={"ID":"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0","Type":"ContainerStarted","Data":"39a4461b7f9d2fd8097010a0749b0f8714f2e725a68bf1bf6bb09488da58d061"} Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.632346 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" event={"ID":"7b540596-75a7-4dd2-9466-758942da4d0d","Type":"ContainerStarted","Data":"98412c5526df2c633482ed8abe0c768b7ee9a742b6916fe02f4cf033ecb6ef7a"} Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.639421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ppdx\" (UniqueName: \"kubernetes.io/projected/25b697fa-b2da-4f7b-9f70-df23ef21caef-kube-api-access-7ppdx\") pod \"cluster-samples-operator-665b6dd947-qc2c7\" (UID: \"25b697fa-b2da-4f7b-9f70-df23ef21caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:42 crc kubenswrapper[4780]: W0929 18:45:42.642086 4780 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac14e5ca_d14b_4f77_9cd0_3aba2a3f3019.slice/crio-e6af6f211b2f7708ff750bba4fc6867a0d4b97da2d6b4ea2b038e8b086fa176c WatchSource:0}: Error finding container e6af6f211b2f7708ff750bba4fc6867a0d4b97da2d6b4ea2b038e8b086fa176c: Status 404 returned error can't find the container with id e6af6f211b2f7708ff750bba4fc6867a0d4b97da2d6b4ea2b038e8b086fa176c Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.645403 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.645756 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.145742819 +0000 UTC m=+143.094040863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.649866 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gj4p8"] Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.666351 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxv98\" (UniqueName: \"kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98\") pod \"collect-profiles-29319525-pq9tq\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.672290 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2ttt"] Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.675024 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmqh\" (UniqueName: \"kubernetes.io/projected/2db80d2a-2621-41ca-a35f-89640799690b-kube-api-access-khmqh\") pod \"ingress-canary-9m8qp\" (UID: \"2db80d2a-2621-41ca-a35f-89640799690b\") " pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.689870 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.691258 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgzvj\" (UniqueName: \"kubernetes.io/projected/0052df5f-706f-4dc9-b03e-dbd98d090fb3-kube-api-access-zgzvj\") pod \"csi-hostpathplugin-776sf\" (UID: \"0052df5f-706f-4dc9-b03e-dbd98d090fb3\") " pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.717559 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.718358 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d13c35f8-dc86-4234-b91d-7e9130419e19-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8t9w9\" (UID: \"d13c35f8-dc86-4234-b91d-7e9130419e19\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.728772 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59png\" (UniqueName: \"kubernetes.io/projected/a5df5c15-d545-48e6-8d2c-d46b39b6b705-kube-api-access-59png\") pod \"etcd-operator-b45778765-nhjbl\" (UID: \"a5df5c15-d545-48e6-8d2c-d46b39b6b705\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.746224 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.746774 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.246755535 +0000 UTC m=+143.195053579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.749881 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9m8qp" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.756488 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpslx\" (UniqueName: \"kubernetes.io/projected/140efc80-e021-46f1-8878-4f018e4f33b7-kube-api-access-qpslx\") pod \"service-ca-9c57cc56f-gmjd6\" (UID: \"140efc80-e021-46f1-8878-4f018e4f33b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.769375 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.769560 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-776sf" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.773112 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkjw\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-kube-api-access-hhkjw\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.793656 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d28ea348-5f33-4484-8039-4cba3bb89234-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z4r54\" (UID: \"d28ea348-5f33-4484-8039-4cba3bb89234\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.815264 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7rk\" (UniqueName: \"kubernetes.io/projected/42152318-f79e-44b4-b9ce-0a9b29eddfff-kube-api-access-2p7rk\") pod \"catalog-operator-68c6474976-fg26g\" (UID: \"42152318-f79e-44b4-b9ce-0a9b29eddfff\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.830627 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2fw6\" (UniqueName: \"kubernetes.io/projected/26a049a2-59dd-4762-ac58-9eb88fc892a4-kube-api-access-p2fw6\") pod \"router-default-5444994796-vgx7v\" (UID: \"26a049a2-59dd-4762-ac58-9eb88fc892a4\") " pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.848596 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.849169 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.349141646 +0000 UTC m=+143.297439690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.856931 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6tc\" (UniqueName: \"kubernetes.io/projected/0761f136-d154-4872-86b1-04658a564728-kube-api-access-9z6tc\") pod \"package-server-manager-789f6589d5-tjr5n\" (UID: \"0761f136-d154-4872-86b1-04658a564728\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.864774 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.870419 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.877146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vjwq\" (UniqueName: \"kubernetes.io/projected/a24d0b10-049f-4082-84c2-06a42e8fa4d9-kube-api-access-5vjwq\") pod \"machine-config-controller-84d6567774-2khnm\" (UID: \"a24d0b10-049f-4082-84c2-06a42e8fa4d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.895786 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.896899 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdfq\" (UniqueName: \"kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq\") pod \"marketplace-operator-79b997595-slln5\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.908592 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.913886 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dphm9\" (UniqueName: \"kubernetes.io/projected/c1ac77a9-1c0d-4613-a093-98c52157eb53-kube-api-access-dphm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-ndvdh\" (UID: \"c1ac77a9-1c0d-4613-a093-98c52157eb53\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.933099 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2bd\" (UniqueName: \"kubernetes.io/projected/9c0a510a-24d6-46ba-8ff7-4682156a908d-kube-api-access-ms2bd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gpwlq\" (UID: \"9c0a510a-24d6-46ba-8ff7-4682156a908d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.933735 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.949726 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:42 crc kubenswrapper[4780]: E0929 18:45:42.950216 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.450199493 +0000 UTC m=+143.398497527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.957405 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.959001 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4npk\" (UniqueName: \"kubernetes.io/projected/ce5c3243-04a4-4b1d-8800-c10b0b83916d-kube-api-access-j4npk\") pod \"multus-admission-controller-857f4d67dd-8n855\" (UID: \"ce5c3243-04a4-4b1d-8800-c10b0b83916d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.965892 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.971605 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wlx7\" (UniqueName: \"kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7\") pod \"console-f9d7485db-77bv2\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.973185 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.981793 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.994886 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc595\" (UniqueName: \"kubernetes.io/projected/b68f3b8f-359e-4bfc-a968-f090b2960ee9-kube-api-access-pc595\") pod \"machine-config-server-9mx5m\" (UID: \"b68f3b8f-359e-4bfc-a968-f090b2960ee9\") " pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:42 crc kubenswrapper[4780]: I0929 18:45:42.997141 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.003411 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.011702 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.017345 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea0b5635-a763-45f8-9529-b57c33c0bef3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4zqqt\" (UID: \"ea0b5635-a763-45f8-9529-b57c33c0bef3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.024227 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.035024 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.042278 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9mx5m" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.050642 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhrm\" (UniqueName: \"kubernetes.io/projected/7f434612-b9f0-48bd-90e9-a8c7c7385466-kube-api-access-9mhrm\") pod \"olm-operator-6b444d44fb-p2p79\" (UID: \"7f434612-b9f0-48bd-90e9-a8c7c7385466\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.051510 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.052008 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.551992625 +0000 UTC m=+143.500290669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.059913 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.069262 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d24s\" (UniqueName: \"kubernetes.io/projected/44fa4e91-9dc5-4eec-91bd-482541249e47-kube-api-access-2d24s\") pod \"dns-default-tcnj7\" (UID: \"44fa4e91-9dc5-4eec-91bd-482541249e47\") " pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.072681 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.072838 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.085928 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj4v\" (UniqueName: \"kubernetes.io/projected/2405452c-cfe6-4a52-b1f6-8e5eff36bddf-kube-api-access-2pj4v\") pod \"service-ca-operator-777779d784-rqptt\" (UID: \"2405452c-cfe6-4a52-b1f6-8e5eff36bddf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.110268 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69dz\" (UniqueName: \"kubernetes.io/projected/be97fd99-dd61-4c66-a928-095939f74649-kube-api-access-c69dz\") pod \"packageserver-d55dfcdfc-t7zv9\" (UID: \"be97fd99-dd61-4c66-a928-095939f74649\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.142572 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.148795 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.160086 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw894\" (UniqueName: \"kubernetes.io/projected/23ba5721-4137-453c-96f7-54a7b3de5902-kube-api-access-kw894\") pod \"machine-config-operator-74547568cd-7zgvf\" (UID: \"23ba5721-4137-453c-96f7-54a7b3de5902\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.163673 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.164271 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.664234333 +0000 UTC m=+143.612532377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.167147 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.169319 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7v67w"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.170491 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vr4gw"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.172827 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxkhh\" (UniqueName: \"kubernetes.io/projected/83670b30-2222-428b-b4cc-17d16e0bedb2-kube-api-access-bxkhh\") pod \"control-plane-machine-set-operator-78cbb6b69f-mkxzc\" (UID: \"83670b30-2222-428b-b4cc-17d16e0bedb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.185827 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.199884 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.209850 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-d4z2z"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.221565 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.231264 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.242539 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.254884 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.265240 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.265709 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.765695214 +0000 UTC m=+143.713993258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.268423 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.285404 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-prx87"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.325829 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.368871 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.369907 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.869552914 +0000 UTC m=+143.817850958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.470268 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.471150 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:43.971028085 +0000 UTC m=+143.919326129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.511577 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9m8qp"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.515613 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-776sf"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.572800 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.573213 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.073193719 +0000 UTC m=+144.021491763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.654179 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7v67w" event={"ID":"d5a6b98f-17b6-4e3c-aa64-9b05b9d23547","Type":"ContainerStarted","Data":"9d5a047e5733519d76064e8f2485445127a53b4bc6cafba6bc30a47df48c3aef"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.656378 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" event={"ID":"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0","Type":"ContainerStarted","Data":"8a2767428d78ff8e1459d5ce2f18eae5f9f7a62030085eab0ee0b0d746ee04e5"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.670788 4780 generic.go:334] "Generic (PLEG): container finished" podID="7b540596-75a7-4dd2-9466-758942da4d0d" containerID="d4ae46897115019e1331bfa0396004ab89803bbf95eb69e69ddca8c148ba77d7" exitCode=0 Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.670914 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" event={"ID":"7b540596-75a7-4dd2-9466-758942da4d0d","Type":"ContainerDied","Data":"d4ae46897115019e1331bfa0396004ab89803bbf95eb69e69ddca8c148ba77d7"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.675942 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.677011 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.176997388 +0000 UTC m=+144.125295432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.681631 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" event={"ID":"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019","Type":"ContainerStarted","Data":"52e4b79e248b284e7fa1d037412a9740a36f93f45f76d62188a157ae938368a1"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.681714 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.681727 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" event={"ID":"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019","Type":"ContainerStarted","Data":"e6af6f211b2f7708ff750bba4fc6867a0d4b97da2d6b4ea2b038e8b086fa176c"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.687732 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" event={"ID":"13048f18-9e41-4649-a688-311a34f74222","Type":"ContainerStarted","Data":"fb7c1cf789a292560028269a24f633953eff7763bdfc75a35cb1101d868af013"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.689755 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9mx5m" event={"ID":"b68f3b8f-359e-4bfc-a968-f090b2960ee9","Type":"ContainerStarted","Data":"194ae26b90f845ce683fca92c12721a2bc7023c7d33f17318ac3c152577dc791"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.730116 4780 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-n6qtf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.730514 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.738697 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" event={"ID":"482b8d66-a8d0-4d21-ba06-6f818f092ea7","Type":"ContainerStarted","Data":"f2ddfc7dd0fa443fadfdc6e9bb6891916bf6893c17e707ce5d8a2d7cb12576c9"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.750597 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" event={"ID":"8d94c958-7aee-4529-9d1d-a961fe232f9b","Type":"ContainerStarted","Data":"470957956acbc2c289db10a9865a8b41385bf4d7b02b53cb729c69f5186399b3"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.760424 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" event={"ID":"0c169409-7ddd-4961-b837-847550878691","Type":"ContainerStarted","Data":"def0086c03cc21eab8dde9fe1508c6cf2a29b6df827f8fb841418f989ccfd173"} Sep 29 18:45:43 crc kubenswrapper[4780]: W0929 18:45:43.761096 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db80d2a_2621_41ca_a35f_89640799690b.slice/crio-1535f72fa6f8f1ae4b5413762dc9c1b561ab9da0d9461ed9b1a28eeba7449341 WatchSource:0}: Error finding container 1535f72fa6f8f1ae4b5413762dc9c1b561ab9da0d9461ed9b1a28eeba7449341: Status 404 returned error can't find the container with id 1535f72fa6f8f1ae4b5413762dc9c1b561ab9da0d9461ed9b1a28eeba7449341 Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.779839 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.780239 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.280194536 +0000 UTC m=+144.228492590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.785640 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.790141 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.290076133 +0000 UTC m=+144.238374177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.800948 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" event={"ID":"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7","Type":"ContainerStarted","Data":"ac2a4452e39db7aabb9d38caa7acae140265328c380078b5c283b13b3d6a20fd"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.801271 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" event={"ID":"22e66bf0-740b-46c9-aa4c-3a26bfc49ba7","Type":"ContainerStarted","Data":"39e10ff5ff5e6c9c2c42d4ff98920b5da9b4475561e261ca3f7bd977bbb14ae7"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.804097 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" event={"ID":"25b697fa-b2da-4f7b-9f70-df23ef21caef","Type":"ContainerStarted","Data":"d3542865d21f688bdda44c39861da233087c6784ac748248ef34387c7af8f928"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.808179 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" event={"ID":"51e4222f-7fd5-41eb-afcc-832602668ada","Type":"ContainerStarted","Data":"1339325007e9539dc229362367cfee2f2606d94b6299f36fb5ceb0f17cefe24c"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.816087 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" event={"ID":"8f7506da-aefc-4178-b6a2-408e686c8040","Type":"ContainerStarted","Data":"9883ee86b687d3441df05a4f57db15305a84967dea4ab43b2d45aa4db294933e"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.822501 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vgx7v" event={"ID":"26a049a2-59dd-4762-ac58-9eb88fc892a4","Type":"ContainerStarted","Data":"6beeb2b6984e243c5af25fe4d67d54f6bcf3636b49d1802161ebd4886488e8a6"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.826351 4780 generic.go:334] "Generic (PLEG): container finished" podID="7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a" containerID="8df88e53af7bb44a55fed13d984d3c4afe4b4b495f5bb5cb4cb25c150330d8e4" exitCode=0 Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.826431 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" event={"ID":"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a","Type":"ContainerDied","Data":"8df88e53af7bb44a55fed13d984d3c4afe4b4b495f5bb5cb4cb25c150330d8e4"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.826522 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" event={"ID":"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a","Type":"ContainerStarted","Data":"50f0a38afe964c61d146a75911c58210c7ab03e9f5be75fc32d8aa80e257c1e0"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.830713 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-prx87" event={"ID":"e7d0d686-bf59-426e-a1d4-99af3f38162f","Type":"ContainerStarted","Data":"c27ee7a03e6ceffc1bf5839dc6071c0df6e83952675f6d8020ed747c5d76f19d"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.841597 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" event={"ID":"c0b12a3c-36ae-410a-9b32-71777ada78a8","Type":"ContainerStarted","Data":"ae2c8980de55eebab5e9b02a9a1f1abe433e1c4ca734b8726b2d941464e38187"} Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.888990 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.889373 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.389335391 +0000 UTC m=+144.337633425 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.934299 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.937035 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.944731 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54"] Sep 29 18:45:43 crc kubenswrapper[4780]: I0929 18:45:43.991430 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:43 crc kubenswrapper[4780]: E0929 18:45:43.991818 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.491804365 +0000 UTC m=+144.440102409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.041432 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhjbl"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.065275 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.092371 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.092773 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.592749679 +0000 UTC m=+144.541047733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.108039 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.201152 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.201933 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.701913535 +0000 UTC m=+144.650211579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.297987 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.306071 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.306743 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.806720377 +0000 UTC m=+144.755018421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.327182 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gmjd6"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.356352 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.366228 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.409611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.409957 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:44.909944406 +0000 UTC m=+144.858242450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.510774 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.511739 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.011712817 +0000 UTC m=+144.960010861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.544422 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.573678 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tcnj7"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.608269 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.612844 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.612941 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.613327 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.113307582 +0000 UTC m=+145.061605626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: W0929 18:45:44.680388 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ac77a9_1c0d_4613_a093_98c52157eb53.slice/crio-a51aef26a72ee299381e3c75cb9ae0f4482a7b7f7f69171e63db2abdc752cc76 WatchSource:0}: Error finding container a51aef26a72ee299381e3c75cb9ae0f4482a7b7f7f69171e63db2abdc752cc76: Status 404 returned error can't find the container with id a51aef26a72ee299381e3c75cb9ae0f4482a7b7f7f69171e63db2abdc752cc76 Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.713913 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.714433 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.214386309 +0000 UTC m=+145.162684353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.785439 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.788597 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.824391 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-rqptt"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.836168 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.836703 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-29 18:45:45.33668671 +0000 UTC m=+145.284984754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.850824 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tcnj7" event={"ID":"44fa4e91-9dc5-4eec-91bd-482541249e47","Type":"ContainerStarted","Data":"74fc7098682ec124662049609bb64147ff42ec2dbc695ae7fcfebec18c434290"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.856194 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" event={"ID":"25b697fa-b2da-4f7b-9f70-df23ef21caef","Type":"ContainerStarted","Data":"45ef9c3a3085274c294bf49ec474ee870a4c9d39ec6ba7405b75bf391ded704f"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.858929 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" event={"ID":"c0b12a3c-36ae-410a-9b32-71777ada78a8","Type":"ContainerStarted","Data":"5dcc37d76d9713bbfc257c2499b48679c592ac4136bfaacfaff729e3d755b818"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.867984 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.893430 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.894575 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d94c958-7aee-4529-9d1d-a961fe232f9b" containerID="47bf36a2eb1d9a97031898a7fdb6fb4af2858dff9380920f8dc6c5280a027107" exitCode=0 Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.894680 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" event={"ID":"8d94c958-7aee-4529-9d1d-a961fe232f9b","Type":"ContainerDied","Data":"47bf36a2eb1d9a97031898a7fdb6fb4af2858dff9380920f8dc6c5280a027107"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.899323 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8n855"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.899360 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" event={"ID":"0c169409-7ddd-4961-b837-847550878691","Type":"ContainerStarted","Data":"0184775bd630cf3b99dcc67f3be5c1a210cc83e548366eadd3dc0f66ae305df9"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.902380 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.903625 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2ttt" podStartSLOduration=123.903614497 
podStartE2EDuration="2m3.903614497s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:44.870166359 +0000 UTC m=+144.818464403" watchObservedRunningTime="2025-09-29 18:45:44.903614497 +0000 UTC m=+144.851912541" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.904761 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" event={"ID":"78d8c0c3-f516-48ef-8279-a2e9e0c04835","Type":"ContainerStarted","Data":"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.906340 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79"] Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.906880 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" event={"ID":"78d8c0c3-f516-48ef-8279-a2e9e0c04835","Type":"ContainerStarted","Data":"bc23b42711602542d1413450ef02684a04b26267e48acd638f0ce028f4cb3772"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.906925 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.909122 4780 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-slln5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.909521 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.910899 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" event={"ID":"a5df5c15-d545-48e6-8d2c-d46b39b6b705","Type":"ContainerStarted","Data":"135116e7d4d283d19fa3d94814ebfa8da1fa6b912b9befad9ebf244ae3e76397"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.912727 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" event={"ID":"5d7b3f63-b077-4b71-82cb-9441ffda1b74","Type":"ContainerStarted","Data":"2dd6f3616de48a6ab2ff02502fe15a765a5e3ff3752559a4a8d629d4fe14b7fa"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.912762 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" event={"ID":"5d7b3f63-b077-4b71-82cb-9441ffda1b74","Type":"ContainerStarted","Data":"7aa5947911275faebe27c74ca90c1419499333ffb61709a04553c6b65a7f1e05"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.915740 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" podStartSLOduration=123.915728458 podStartE2EDuration="2m3.915728458s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:44.914605381 +0000 UTC m=+144.862903445" watchObservedRunningTime="2025-09-29 18:45:44.915728458 +0000 UTC m=+144.864026502" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.920860 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" event={"ID":"a24d0b10-049f-4082-84c2-06a42e8fa4d9","Type":"ContainerStarted","Data":"b3e6594454c0b19b526b2bf6800263d5b30a9b540fa99191394d834b49f48be7"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.925378 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" event={"ID":"482b8d66-a8d0-4d21-ba06-6f818f092ea7","Type":"ContainerStarted","Data":"f69bc3d24231e94ded3d05f44bd33eefd8d624c8cb5bc2dd5ae850bdcad2b12e"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.929211 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" event={"ID":"c1ac77a9-1c0d-4613-a093-98c52157eb53","Type":"ContainerStarted","Data":"a51aef26a72ee299381e3c75cb9ae0f4482a7b7f7f69171e63db2abdc752cc76"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.932450 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7v67w" event={"ID":"d5a6b98f-17b6-4e3c-aa64-9b05b9d23547","Type":"ContainerStarted","Data":"15a510b7c283e52d0746d5f6a8c1bc98c481bffb6c0d66e9ee89a3ec783e76ed"} Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.932707 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.944566 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:44 crc kubenswrapper[4780]: E0929 18:45:44.948023 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.447983417 +0000 UTC m=+145.396281461 (durationBeforeRetry 500ms). 
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.951794 4780 patch_prober.go:28] interesting pod/downloads-7954f5f757-7v67w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.951844 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" event={"ID":"7f58a165-7d7b-435e-bea2-33d2e8498a1f","Type":"ContainerStarted","Data":"964c59ee7678b563e1ee3063f90bd9d413f897f120f547c6fc412bf4f4d03efa"}
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.951866 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7v67w" podUID="d5a6b98f-17b6-4e3c-aa64-9b05b9d23547" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.968803 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9mx5m" event={"ID":"b68f3b8f-359e-4bfc-a968-f090b2960ee9","Type":"ContainerStarted","Data":"d9dbf3056c389ee180b35d77a7b17c6461e1fe4809816fd443ab5332862a55d5"}
Sep 29 18:45:44 crc kubenswrapper[4780]: W0929 18:45:44.977027 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23ba5721_4137_453c_96f7_54a7b3de5902.slice/crio-63aecbd2b0bd9e31585ddf78f15e77a3c617996d614506a1920b09e365b62ace WatchSource:0}: Error finding container 63aecbd2b0bd9e31585ddf78f15e77a3c617996d614506a1920b09e365b62ace: Status 404 returned error can't find the container with id 63aecbd2b0bd9e31585ddf78f15e77a3c617996d614506a1920b09e365b62ace
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.979582 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" event={"ID":"13048f18-9e41-4649-a688-311a34f74222","Type":"ContainerStarted","Data":"41cef6e1beacc3690b315606c32ea002b4377c75ec4b2ad7f5e8c5c45398a0a3"}
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.988366 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-776sf" event={"ID":"0052df5f-706f-4dc9-b03e-dbd98d090fb3","Type":"ContainerStarted","Data":"1cf6daef74cfb4b7113fef925331e8fefc22afaf76f38a9703b2eb8dd54be43b"}
Sep 29 18:45:44 crc kubenswrapper[4780]: I0929 18:45:44.991232 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vgx7v" event={"ID":"26a049a2-59dd-4762-ac58-9eb88fc892a4","Type":"ContainerStarted","Data":"8fccf58862dc5a0c7cb3faaf813caa87f3207b05ae853258edae9eaf50903ce0"}
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.002072 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" event={"ID":"140efc80-e021-46f1-8878-4f018e4f33b7","Type":"ContainerStarted","Data":"abda9f078ccaeb2a71d4009a386e6e7904dcee706798a1bf12dc14305018d2e6"}
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.007476 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" event={"ID":"42152318-f79e-44b4-b9ce-0a9b29eddfff","Type":"ContainerStarted","Data":"6d7fb9e4573a6cef4ccafa0952192fa072d72e2bc836aa56df26cdc24949faef"}
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.017020 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vgx7v"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.018846 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" event={"ID":"0761f136-d154-4872-86b1-04658a564728","Type":"ContainerStarted","Data":"93361b9ccb33e1750061b82d8e687ea7e40b0184ea5e996687eed6e5917b83fa"}
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.020499 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.020566 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.026461 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-prx87" event={"ID":"e7d0d686-bf59-426e-a1d4-99af3f38162f","Type":"ContainerStarted","Data":"074046b0d2574869f4de1446e79ae2ec34ae48045a407a1f5b9db24f47055313"}
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.028428 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" podStartSLOduration=123.02839931 podStartE2EDuration="2m3.02839931s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.001801209 +0000 UTC m=+144.950099253" watchObservedRunningTime="2025-09-29 18:45:45.02839931 +0000 UTC m=+144.976697354"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.028874 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-prx87"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.030586 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" podStartSLOduration=45.030576003 podStartE2EDuration="45.030576003s" podCreationTimestamp="2025-09-29 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.030016854 +0000 UTC m=+144.978314898" watchObservedRunningTime="2025-09-29 18:45:45.030576003 +0000 UTC m=+144.978874047"
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" event={"ID":"8f7506da-aefc-4178-b6a2-408e686c8040","Type":"ContainerStarted","Data":"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.034883 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.049971 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" event={"ID":"abd4d246-d458-4255-8edb-f043e8633560","Type":"ContainerStarted","Data":"f8e62d9e54c677fb7dffaaf22cb603a34e4be9f6f856fa7c69f4cc592c5e0269"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.050516 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" event={"ID":"abd4d246-d458-4255-8edb-f043e8633560","Type":"ContainerStarted","Data":"10ffaf33c1187f4e7c1f671a88587e75440626484826975a4bbea96fcf3e965b"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.055548 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.056473 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.556448639 +0000 UTC m=+145.504746683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.058463 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" event={"ID":"d28ea348-5f33-4484-8039-4cba3bb89234","Type":"ContainerStarted","Data":"0c4ada25040a6db6753257269d081bb37cd686ca262d8634ed7bf1e90633d059"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.058626 4780 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-w8l7s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.058718 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.062709 4780 patch_prober.go:28] interesting pod/console-operator-58897d9998-prx87 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.062764 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-prx87" podUID="e7d0d686-bf59-426e-a1d4-99af3f38162f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.062908 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77bv2" event={"ID":"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672","Type":"ContainerStarted","Data":"c8c0ed1a73c68ff2393b25aac1032e01a6a2f16234a9b1d2b5d51c4626b7c410"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.071862 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" event={"ID":"51e4222f-7fd5-41eb-afcc-832602668ada","Type":"ContainerStarted","Data":"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.079015 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.111241 4780 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vr2qc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection 
refused" start-of-body= Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.111300 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.114953 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" event={"ID":"268fec05-58bc-4843-b4c5-4bcc6d9cb8d0","Type":"ContainerStarted","Data":"dd22476ab8a25e0412bf11cb47a83066cfbf8f5b6344985005d23647405663ce"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.134351 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9m8qp" event={"ID":"2db80d2a-2621-41ca-a35f-89640799690b","Type":"ContainerStarted","Data":"fbbd93f1162707fb7baeab4668b0f4c4675448fc40b9af644bccf024a26e9c75"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.134688 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9m8qp" event={"ID":"2db80d2a-2621-41ca-a35f-89640799690b","Type":"ContainerStarted","Data":"1535f72fa6f8f1ae4b5413762dc9c1b561ab9da0d9461ed9b1a28eeba7449341"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.156476 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7v67w" podStartSLOduration=124.156451242 podStartE2EDuration="2m4.156451242s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.115410273 +0000 UTC m=+145.063708317" watchObservedRunningTime="2025-09-29 18:45:45.156451242 +0000 UTC m=+145.104749286" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.160568 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.160760 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.660729344 +0000 UTC m=+145.609027388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.161546 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.166093 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.666075411 +0000 UTC m=+145.614373455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.167396 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vgx7v" podStartSLOduration=123.167368704 podStartE2EDuration="2m3.167368704s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.153656609 +0000 UTC m=+145.101954653" watchObservedRunningTime="2025-09-29 18:45:45.167368704 +0000 UTC m=+145.115666748" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.168311 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" event={"ID":"d13c35f8-dc86-4234-b91d-7e9130419e19","Type":"ContainerStarted","Data":"3dbc8d0974523fae55ff402dff1934b1101e1601dc9c732b30477a2115831465"} Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.186267 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.190518 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8zct4" podStartSLOduration=124.19049399 podStartE2EDuration="2m4.19049399s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.189092413 +0000 UTC m=+145.137390487" watchObservedRunningTime="2025-09-29 18:45:45.19049399 +0000 UTC m=+145.138792034" Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.238452 4780 
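The pod_startup_latency_tracker entries are self-consistent: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and because firstStartedPulling and lastFinishedPulling are Go's zero time (0001-01-01, meaning no image pull was observed), the SLO duration equals the end-to-end duration. Rechecking the router-default entry above from its two timestamps:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Timestamps copied from the router-default tracker line; parse errors
	// are ignored here only because the inputs are fixed literals.
	created, _ := time.Parse(layout, "2025-09-29 18:43:42 +0000 UTC")
	running, _ := time.Parse(layout, "2025-09-29 18:45:45.167368704 +0000 UTC")
	fmt.Println(running.Sub(created)) // 2m3.167368704s, matching podStartE2EDuration
}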
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.238452 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" podStartSLOduration=124.238432817 podStartE2EDuration="2m4.238432817s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.232307975 +0000 UTC m=+145.180606019" watchObservedRunningTime="2025-09-29 18:45:45.238432817 +0000 UTC m=+145.186730861"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.314172 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.317245 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.817217217 +0000 UTC m=+145.765515251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.341587 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-prx87" podStartSLOduration=124.341546943 podStartE2EDuration="2m4.341546943s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.337567041 +0000 UTC m=+145.285865105" watchObservedRunningTime="2025-09-29 18:45:45.341546943 +0000 UTC m=+145.289844987"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.342321 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9mx5m" podStartSLOduration=5.342307878 podStartE2EDuration="5.342307878s" podCreationTimestamp="2025-09-29 18:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.306281985 +0000 UTC m=+145.254580039" watchObservedRunningTime="2025-09-29 18:45:45.342307878 +0000 UTC m=+145.290605922"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.371375 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" podStartSLOduration=123.37135451 podStartE2EDuration="2m3.37135451s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.370227553 +0000 UTC m=+145.318525597" watchObservedRunningTime="2025-09-29 18:45:45.37135451 +0000 UTC m=+145.319652554"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.409978 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" podStartSLOduration=123.409945959 podStartE2EDuration="2m3.409945959s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.400645661 +0000 UTC m=+145.348943705" watchObservedRunningTime="2025-09-29 18:45:45.409945959 +0000 UTC m=+145.358244003"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.415876 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.416357 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:45.91633455 +0000 UTC m=+145.864632584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.449788 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8jj99" podStartSLOduration=124.449759557 podStartE2EDuration="2m4.449759557s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.436352053 +0000 UTC m=+145.384650117" watchObservedRunningTime="2025-09-29 18:45:45.449759557 +0000 UTC m=+145.398057601"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.491251 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62klh" podStartSLOduration=124.491227041 podStartE2EDuration="2m4.491227041s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.484432036 +0000 UTC m=+145.432730090" watchObservedRunningTime="2025-09-29 18:45:45.491227041 +0000 UTC m=+145.439525085"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.516804 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.518452 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.018427822 +0000 UTC m=+145.966725876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.521131 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9m8qp" podStartSLOduration=5.52108583 podStartE2EDuration="5.52108583s" podCreationTimestamp="2025-09-29 18:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.511568715 +0000 UTC m=+145.459866759" watchObservedRunningTime="2025-09-29 18:45:45.52108583 +0000 UTC m=+145.469383874"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.559356 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" podStartSLOduration=123.559334257 podStartE2EDuration="2m3.559334257s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:45.549686647 +0000 UTC m=+145.497984691" watchObservedRunningTime="2025-09-29 18:45:45.559334257 +0000 UTC m=+145.507632301"
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.620142 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.620531 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.120517964 +0000 UTC m=+146.068816008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.721198 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.721694 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.221675594 +0000 UTC m=+146.169973638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.822899 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.823846 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.323832148 +0000 UTC m=+146.272130192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:45 crc kubenswrapper[4780]: I0929 18:45:45.925559 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:45 crc kubenswrapper[4780]: E0929 18:45:45.926174 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.426151577 +0000 UTC m=+146.374449621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.022569 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 18:45:46 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld
Sep 29 18:45:46 crc kubenswrapper[4780]: [+]process-running ok
Sep 29 18:45:46 crc kubenswrapper[4780]: healthz check failed
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.023036 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.027579 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.028090 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.528063313 +0000 UTC m=+146.476361357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
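The router's startup probe has now progressed from connection refused to an HTTP 500, and the 500 body is the useful part: it is the standard healthz breakdown, one line per registered check, where any [-] line fails the whole endpoint. Here backend-http and has-synced are failing while process-running passes, so the router process is up but not yet serving synced state. A toy endpoint in the same response format, with the check names taken from the log and the aggregation logic merely illustrative:

package main

import (
	"fmt"
	"log"
	"net/http"
)

// healthz renders one [+]/[-] line per check and returns 500 with
// "healthz check failed" if any check fails, like the router output above.
func healthz(checks map[string]bool) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for name, ok := range checks {
			if ok {
				body += fmt.Sprintf("[+]%s ok\n", name)
			} else {
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", name)
				failed = true
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError)
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	http.Handle("/healthz/ready", healthz(map[string]bool{
		"backend-http": false, "has-synced": false, "process-running": true,
	}))
	log.Fatal(http.ListenAndServe("localhost:1936", nil))
}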
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.131491 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.131719 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.631685045 +0000 UTC m=+146.579983079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.133877 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.134748 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.634728166 +0000 UTC m=+146.583026210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.200058 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" event={"ID":"a5df5c15-d545-48e6-8d2c-d46b39b6b705","Type":"ContainerStarted","Data":"4ccafcc67c4a20d2cbf0c7294c9914aa90a655008e571c60210eb45c27dc80be"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.219365 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" event={"ID":"5d7b3f63-b077-4b71-82cb-9441ffda1b74","Type":"ContainerStarted","Data":"caea1bdcfad5a68b6191d5c898740c67e8b282eb6158200494d8b80de884faa0"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.237144 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" event={"ID":"83670b30-2222-428b-b4cc-17d16e0bedb2","Type":"ContainerStarted","Data":"5cfc36dcef9047a118578dae247f250c4e4dc5bd6f8d67299dee5e136c935d9c"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.237774 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.238737 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.738709771 +0000 UTC m=+146.687007815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.256087 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" event={"ID":"7f58a165-7d7b-435e-bea2-33d2e8498a1f","Type":"ContainerStarted","Data":"998279401ad2eeda89dfed3369a4fed9c8fe4ef7069aed5b9c26203a8826eec5"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.265618 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" event={"ID":"23ba5721-4137-453c-96f7-54a7b3de5902","Type":"ContainerStarted","Data":"8acfdbe43a2eded779c334d16663dbe8b0ace7a62503821a73149c7158ec0575"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.265685 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" event={"ID":"23ba5721-4137-453c-96f7-54a7b3de5902","Type":"ContainerStarted","Data":"63aecbd2b0bd9e31585ddf78f15e77a3c617996d614506a1920b09e365b62ace"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.270286 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nhjbl" podStartSLOduration=125.270253335 podStartE2EDuration="2m5.270253335s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.269825841 +0000 UTC m=+146.218123905" watchObservedRunningTime="2025-09-29 18:45:46.270253335 +0000 UTC m=+146.218551379"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.276572 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" event={"ID":"42152318-f79e-44b4-b9ce-0a9b29eddfff","Type":"ContainerStarted","Data":"8cf78127e7ecbcc9cdf946c94ef7a662e191bf4e1753827fa690eeeb0f1e5dd0"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.277866 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.284347 4780 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fg26g container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.284449 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" podUID="42152318-f79e-44b4-b9ce-0a9b29eddfff" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.303842 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" event={"ID":"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a","Type":"ContainerStarted","Data":"2168c77453bdea4d485003eb78023310e8b444052b8b995bcb337540fe1fabc1"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.303960 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" event={"ID":"7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a","Type":"ContainerStarted","Data":"24039ec6daeed12cc93c1018155f9d9a8ad11c8f317a1d13cfa13ffc18a0ecfb"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.341035 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.343600 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.843580044 +0000 UTC m=+146.791878098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.343700 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fxr6n" podStartSLOduration=124.343677517 podStartE2EDuration="2m4.343677517s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.339246651 +0000 UTC m=+146.287544695" watchObservedRunningTime="2025-09-29 18:45:46.343677517 +0000 UTC m=+146.291975561"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.354076 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" event={"ID":"7b540596-75a7-4dd2-9466-758942da4d0d","Type":"ContainerStarted","Data":"cc75eeddc07615d1fe7f78c3e4ae10c8016f4519b9343fb83bb586b633bc310c"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.374760 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" event={"ID":"c1ac77a9-1c0d-4613-a093-98c52157eb53","Type":"ContainerStarted","Data":"65ae5d225cbfb91ba16bf442a131fcb8927b93fcf1305df866707030941be664"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.388663 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mnkdc" podStartSLOduration=124.388636807 podStartE2EDuration="2m4.388636807s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.387169308 +0000 UTC m=+146.335467362" watchObservedRunningTime="2025-09-29 18:45:46.388636807 +0000 UTC m=+146.336934851"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.426503 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8t9w9" event={"ID":"d13c35f8-dc86-4234-b91d-7e9130419e19","Type":"ContainerStarted","Data":"07e4edf33c423e8c751a6d1daade8f2f6b64a2867ac7d2eebc6499c058a2ac84"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.432558 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ndvdh" podStartSLOduration=124.43252744 podStartE2EDuration="2m4.43252744s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.418489866 +0000 UTC m=+146.366787910" watchObservedRunningTime="2025-09-29 18:45:46.43252744 +0000 UTC m=+146.380825484"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.444120 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.457186 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.957154226 +0000 UTC m=+146.905452270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.457591 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:46.95758306 +0000 UTC m=+146.905881104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.457301 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.466705 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" event={"ID":"ce5c3243-04a4-4b1d-8800-c10b0b83916d","Type":"ContainerStarted","Data":"26bb4384df42c2e55f6f685a1f71660ce4e8483e44bd52276148778c72e0fea9"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.466755 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" event={"ID":"ce5c3243-04a4-4b1d-8800-c10b0b83916d","Type":"ContainerStarted","Data":"3fde19c1aa96f56905744fd5eed447b4f48749c84adaa434becb12e8b03d34c7"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.497531 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" event={"ID":"be97fd99-dd61-4c66-a928-095939f74649","Type":"ContainerStarted","Data":"fcee2d5f18baf847603040f6b7f4b276488a0e41c1bf759de48409ae35f2a8d0"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.497594 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" event={"ID":"be97fd99-dd61-4c66-a928-095939f74649","Type":"ContainerStarted","Data":"a279ebc027cf5b22f5c417122e8f9cf2c83916a58673cdb38bc359fda288cc3b"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.498148 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.507209 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z4r54" event={"ID":"d28ea348-5f33-4484-8039-4cba3bb89234","Type":"ContainerStarted","Data":"87a54fd3a445b24a780da8e2f260b1f12768527ad2d25a6293e5e14df3f0ef37"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.516372 4780 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-t7zv9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body=
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.516458 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" podUID="be97fd99-dd61-4c66-a928-095939f74649" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.521425 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" event={"ID":"c0b12a3c-36ae-410a-9b32-71777ada78a8","Type":"ContainerStarted","Data":"378a592dbbe7d10ae0e9e903edcb196f2f43baec298d6447523aacf097bd94a8"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.540180 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" event={"ID":"0c169409-7ddd-4961-b837-847550878691","Type":"ContainerStarted","Data":"f6e32c7962d305b670fe9d0a50d488df362a859289d96ddee6b4019f490035b0"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.559319 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" podStartSLOduration=125.559292809 podStartE2EDuration="2m5.559292809s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.486632323 +0000 UTC m=+146.434930367" watchObservedRunningTime="2025-09-29 18:45:46.559292809 +0000 UTC m=+146.507590853"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.560589 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" podStartSLOduration=124.560582512 podStartE2EDuration="2m4.560582512s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.559725884 +0000 UTC m=+146.508023928" watchObservedRunningTime="2025-09-29 18:45:46.560582512 +0000 UTC m=+146.508880556"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.560870 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.561636 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.061614126 +0000 UTC m=+147.009912160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.562548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" event={"ID":"140efc80-e021-46f1-8878-4f018e4f33b7","Type":"ContainerStarted","Data":"9ba88971ccb48f41599017edc4932e30b13f54f6b4a400ee373b6e79da97a31c"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.563466 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.569573 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.069536749 +0000 UTC m=+147.017834793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.603963 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" event={"ID":"25b697fa-b2da-4f7b-9f70-df23ef21caef","Type":"ContainerStarted","Data":"5e4e93ad3e69b94f1ee59a865ded46aa2940d16f0402add5da2e858c2b9ff00e"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.668144 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" podStartSLOduration=124.668111534 podStartE2EDuration="2m4.668111534s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.641287915 +0000 UTC m=+146.589585979" watchObservedRunningTime="2025-09-29 18:45:46.668111534 +0000 UTC m=+146.616409578"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.671269 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.672904 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.172883032 +0000 UTC m=+147.121181386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.677382 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qc2c7" podStartSLOduration=125.67736828 podStartE2EDuration="2m5.67736828s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.675647624 +0000 UTC m=+146.623945668" watchObservedRunningTime="2025-09-29 18:45:46.67736828 +0000 UTC m=+146.625666324"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.724350 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" event={"ID":"8d94c958-7aee-4529-9d1d-a961fe232f9b","Type":"ContainerStarted","Data":"742fa5d43effa470779589fe3cb464edc0a86d8d53886b449f26c83eb3428c7b"}
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.726548 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s"
Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.773070 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.774905 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.274892011 +0000 UTC m=+147.223190055 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.776069 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gmjd6" podStartSLOduration=124.776031849 podStartE2EDuration="2m4.776031849s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.727721668 +0000 UTC m=+146.676019712" watchObservedRunningTime="2025-09-29 18:45:46.776031849 +0000 UTC m=+146.724329893" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.776467 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vr4gw" podStartSLOduration=125.776462353 podStartE2EDuration="2m5.776462353s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.775456359 +0000 UTC m=+146.723754403" watchObservedRunningTime="2025-09-29 18:45:46.776462353 +0000 UTC m=+146.724760387" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.785724 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" event={"ID":"ea0b5635-a763-45f8-9529-b57c33c0bef3","Type":"ContainerStarted","Data":"3d3892da4714b5d0c128f00ccefb07ad8bc001b802c44ee6450d001c44001585"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.785769 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" event={"ID":"ea0b5635-a763-45f8-9529-b57c33c0bef3","Type":"ContainerStarted","Data":"ea24fb687b03e8bfb3a511c79f5e51e294f5a2f6b237bb38e01c311a5b693bb2"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.785778 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" event={"ID":"ea0b5635-a763-45f8-9529-b57c33c0bef3","Type":"ContainerStarted","Data":"0ad8f55fe7335c8a3458e54af415ee59798bf2db8643e4e8e058b24aae9b0f06"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.785790 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" event={"ID":"2405452c-cfe6-4a52-b1f6-8e5eff36bddf","Type":"ContainerStarted","Data":"7666c05abb1148ceafa3ce4efe4a14da771fffde976c703f62b30f75ae8bbda5"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.785803 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" event={"ID":"2405452c-cfe6-4a52-b1f6-8e5eff36bddf","Type":"ContainerStarted","Data":"cc550bfbde15d430dbdc96c320eccce58ed873c12656176e450d7bdfff34dcb8"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.856398 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" 
event={"ID":"7f434612-b9f0-48bd-90e9-a8c7c7385466","Type":"ContainerStarted","Data":"1a4701a975e1d8edb5411f019e25b8e7b32f78cbb31a77f08b010250182cfeda"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.857189 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.859374 4780 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-p2p79 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.859420 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" podUID="7f434612-b9f0-48bd-90e9-a8c7c7385466" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.884416 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.886218 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.386192508 +0000 UTC m=+147.334490562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.908380 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" podStartSLOduration=124.908360072 podStartE2EDuration="2m4.908360072s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.842975796 +0000 UTC m=+146.791273840" watchObservedRunningTime="2025-09-29 18:45:46.908360072 +0000 UTC m=+146.856658116" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.909192 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-d4z2z" podStartSLOduration=124.909183419 podStartE2EDuration="2m4.909183419s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.897881115 +0000 UTC m=+146.846179169" watchObservedRunningTime="2025-09-29 18:45:46.909183419 +0000 UTC m=+146.857481463" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.916606 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" event={"ID":"0761f136-d154-4872-86b1-04658a564728","Type":"ContainerStarted","Data":"303f4a7ef2478ba56c4e70b51b4ed97c3bdee212950f91e646b6bf884e378bbf"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.918087 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.953260 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" event={"ID":"9c0a510a-24d6-46ba-8ff7-4682156a908d","Type":"ContainerStarted","Data":"13b819e3cdd195613c571c16a11f307d4bad62d0eea7b63ef4429d0dff25b447"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.953313 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" event={"ID":"9c0a510a-24d6-46ba-8ff7-4682156a908d","Type":"ContainerStarted","Data":"002099c8d1e20db82f783fbbb17a15953d2f2fb9803175fb54708d4b8a9d16bb"} Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.987572 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.987617 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.988286 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:46 crc kubenswrapper[4780]: E0929 18:45:46.989397 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.489375345 +0000 UTC m=+147.437673389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:46 crc kubenswrapper[4780]: I0929 18:45:46.990941 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tcnj7" event={"ID":"44fa4e91-9dc5-4eec-91bd-482541249e47","Type":"ContainerStarted","Data":"7ce1b609fd6e8728549d608c6490393858e5e26e6f329108e6c830c0e58d804a"} Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.001409 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" podStartSLOduration=126.001387713 podStartE2EDuration="2m6.001387713s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:46.971159542 +0000 UTC m=+146.919457586" watchObservedRunningTime="2025-09-29 18:45:47.001387713 +0000 UTC m=+146.949685757" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.002509 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-rqptt" podStartSLOduration=125.00250167 podStartE2EDuration="2m5.00250167s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.000484483 +0000 UTC m=+146.948782527" watchObservedRunningTime="2025-09-29 18:45:47.00250167 +0000 UTC m=+146.950799714" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.027394 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 18:45:47 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld Sep 29 18:45:47 crc kubenswrapper[4780]: [+]process-running ok Sep 29 18:45:47 crc kubenswrapper[4780]: healthz check failed Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.027463 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.027551 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" event={"ID":"a24d0b10-049f-4082-84c2-06a42e8fa4d9","Type":"ContainerStarted","Data":"91abfe071c43a1b6dd4e4a360dc84a1734e4053e719303fecae013a9ac1f0a22"} Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.049581 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77bv2" event={"ID":"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672","Type":"ContainerStarted","Data":"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7"} Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.052313 4780 patch_prober.go:28] interesting pod/downloads-7954f5f757-7v67w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.052382 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7v67w" podUID="d5a6b98f-17b6-4e3c-aa64-9b05b9d23547" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.052603 4780 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-slln5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.052672 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.085171 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.085874 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.089187 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.090069 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" podStartSLOduration=125.090034039 podStartE2EDuration="2m5.090034039s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.054503863 +0000 UTC m=+147.002801907" watchObservedRunningTime="2025-09-29 18:45:47.090034039 +0000 UTC m=+147.038332083" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.091008 4780 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.590979941 +0000 UTC m=+147.539277985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.091110 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4zqqt" podStartSLOduration=125.091102315 podStartE2EDuration="2m5.091102315s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.088426296 +0000 UTC m=+147.036724340" watchObservedRunningTime="2025-09-29 18:45:47.091102315 +0000 UTC m=+147.039400359" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.120459 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gpwlq" podStartSLOduration=126.120408896 podStartE2EDuration="2m6.120408896s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.117866191 +0000 UTC m=+147.066164235" watchObservedRunningTime="2025-09-29 18:45:47.120408896 +0000 UTC m=+147.068706930" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.122849 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.122993 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.154798 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-77bv2" podStartSLOduration=126.154772294 podStartE2EDuration="2m6.154772294s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.15405646 +0000 UTC m=+147.102354504" watchObservedRunningTime="2025-09-29 18:45:47.154772294 +0000 UTC m=+147.103070338" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.175137 4780 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gj4p8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.175197 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" podUID="7e9e79c7-e707-4cc7-8d48-1cd44bf44f5a" containerName="openshift-apiserver" probeResult="failure" output="Get 
\"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.192174 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.196922 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.696903649 +0000 UTC m=+147.645201693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.248432 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" podStartSLOduration=125.248407735 podStartE2EDuration="2m5.248407735s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.214558664 +0000 UTC m=+147.162856708" watchObservedRunningTime="2025-09-29 18:45:47.248407735 +0000 UTC m=+147.196705769" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.304448 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.305128 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.805108684 +0000 UTC m=+147.753406728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.314186 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" podStartSLOduration=125.300026835 podStartE2EDuration="2m5.300026835s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:47.253446932 +0000 UTC m=+147.201744976" watchObservedRunningTime="2025-09-29 18:45:47.300026835 +0000 UTC m=+147.248324879" Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.406333 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.406983 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:47.906968038 +0000 UTC m=+147.855266082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.507783 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.508025 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.007988314 +0000 UTC m=+147.956286358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.508236 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.508586 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.008571613 +0000 UTC m=+147.956869657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.609932 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.610210 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.110173119 +0000 UTC m=+148.058471173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.610293 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.610803 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.110789119 +0000 UTC m=+148.059087163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.713469 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.713718 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.213679337 +0000 UTC m=+148.161977381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.713890 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.714333 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.214323248 +0000 UTC m=+148.162621292 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.814961 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.815364 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.315346505 +0000 UTC m=+148.263644549 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:47 crc kubenswrapper[4780]: I0929 18:45:47.916589 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:47 crc kubenswrapper[4780]: E0929 18:45:47.917173 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.41694822 +0000 UTC m=+148.365246264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.017557 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.017965 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.517942905 +0000 UTC m=+148.466240949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.018592 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 18:45:48 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld Sep 29 18:45:48 crc kubenswrapper[4780]: [+]process-running ok Sep 29 18:45:48 crc kubenswrapper[4780]: healthz check failed Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.018646 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.020893 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.051411 4780 patch_prober.go:28] interesting pod/console-operator-58897d9998-prx87 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.051510 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-prx87" podUID="e7d0d686-bf59-426e-a1d4-99af3f38162f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.055287 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n" event={"ID":"0761f136-d154-4872-86b1-04658a564728","Type":"ContainerStarted","Data":"c976a740502b8f03d55b0012543c990e87ce65864614176420dfabe4f784e33e"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.058782 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" event={"ID":"23ba5721-4137-453c-96f7-54a7b3de5902","Type":"ContainerStarted","Data":"621278a44be4a991a0afc3e4e2525687b5af933dcb187f13a8a09415c149c48f"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.062763 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" event={"ID":"ce5c3243-04a4-4b1d-8800-c10b0b83916d","Type":"ContainerStarted","Data":"7ebac54da99ff7246b47306af7578409dbcece36593c6e241eb8a4b623a6dd0e"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.064866 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-776sf" 
event={"ID":"0052df5f-706f-4dc9-b03e-dbd98d090fb3","Type":"ContainerStarted","Data":"e251ca39ea1a965c05b4674c4615271267da05b8be983fde6039cfd8aaace452"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.067676 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tcnj7" event={"ID":"44fa4e91-9dc5-4eec-91bd-482541249e47","Type":"ContainerStarted","Data":"d248e74f657b0600742b9233f6416cc1ec2205288647b4e4c1102939bd6dd952"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.067955 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tcnj7" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.070173 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2khnm" event={"ID":"a24d0b10-049f-4082-84c2-06a42e8fa4d9","Type":"ContainerStarted","Data":"22464d3533ef72454b0e450b874ee174b26693308a4f8592f49ae2068973d515"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.071946 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" event={"ID":"7f434612-b9f0-48bd-90e9-a8c7c7385466","Type":"ContainerStarted","Data":"20c089f63ebdfbd3e814f6abf59a2ac04ede31f1141627570a5ee68b4a736c76"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.075315 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" event={"ID":"83670b30-2222-428b-b4cc-17d16e0bedb2","Type":"ContainerStarted","Data":"9dc4a4236b629788a1b3ffe72b3ef909047d7760b9a366e4f1dfd551b174a040"} Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.075570 4780 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-slln5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.075630 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.095568 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fg26g" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.098649 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-p2p79" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.106005 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qnzcj" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.115882 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7zgvf" podStartSLOduration=126.115856089 podStartE2EDuration="2m6.115856089s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:48.111347749 +0000 UTC 
m=+148.059645793" watchObservedRunningTime="2025-09-29 18:45:48.115856089 +0000 UTC m=+148.064154143" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.119287 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.123272 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.623247183 +0000 UTC m=+148.571545227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.221013 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.221281 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.721231538 +0000 UTC m=+148.669529592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.221844 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.224805 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.724785426 +0000 UTC m=+148.673083470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.320657 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mkxzc" podStartSLOduration=126.320629591 podStartE2EDuration="2m6.320629591s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:48.255782283 +0000 UTC m=+148.204080327" watchObservedRunningTime="2025-09-29 18:45:48.320629591 +0000 UTC m=+148.268927635" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.323924 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.324641 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.824623233 +0000 UTC m=+148.772921277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.441923 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.442823 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:48.942801247 +0000 UTC m=+148.891099291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.462649 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8n855" podStartSLOduration=126.462613684 podStartE2EDuration="2m6.462613684s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:48.462021514 +0000 UTC m=+148.410319558" watchObservedRunningTime="2025-09-29 18:45:48.462613684 +0000 UTC m=+148.410911748" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.537284 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tcnj7" podStartSLOduration=8.537257386 podStartE2EDuration="8.537257386s" podCreationTimestamp="2025-09-29 18:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:48.53678835 +0000 UTC m=+148.485086394" watchObservedRunningTime="2025-09-29 18:45:48.537257386 +0000 UTC m=+148.485555430" Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.545649 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.546060 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.046025796 +0000 UTC m=+148.994323840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.649562 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.649640 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.649683 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.649710 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.649751 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.650267 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.150241778 +0000 UTC m=+149.098539832 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.667270 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.668445 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.674919 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.682938 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.740705 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"]
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.741825 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.750344 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.751266 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.751751 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.25172653 +0000 UTC m=+149.200024564 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.770601 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.780464 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.781265 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"]
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.844637 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-prx87"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.873710 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mhk\" (UniqueName: \"kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.873762 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.873804 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.873859 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.874248 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.374235368 +0000 UTC m=+149.322533412 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.890813 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.981516 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.981739 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mhk\" (UniqueName: \"kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.981767 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.981801 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.982261 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:48 crc kubenswrapper[4780]: E0929 18:45:48.982339 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.482321728 +0000 UTC m=+149.430619772 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:48 crc kubenswrapper[4780]: I0929 18:45:48.982890 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.022381 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f2g2b"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.023456 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.029335 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 18:45:49 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld
Sep 29 18:45:49 crc kubenswrapper[4780]: [+]process-running ok
Sep 29 18:45:49 crc kubenswrapper[4780]: healthz check failed
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.029388 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.039659 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.057175 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mhk\" (UniqueName: \"kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk\") pod \"certified-operators-qtrcx\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.078189 4780 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-dnm7s container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.078286 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s" podUID="8d94c958-7aee-4529-9d1d-a961fe232f9b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.078959 4780 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-t7zv9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.078988 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" podUID="be97fd99-dd61-4c66-a928-095939f74649" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.086484 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2g2b"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.087021 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.087967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.088485 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.588468234 +0000 UTC m=+149.536766278 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.143812 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.151270 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.189954 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.190293 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdqn\" (UniqueName: \"kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.190379 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.190413 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.190572 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.690550586 +0000 UTC m=+149.638848630 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.198136 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.213442 4780 generic.go:334] "Generic (PLEG): container finished" podID="482b8d66-a8d0-4d21-ba06-6f818f092ea7" containerID="f69bc3d24231e94ded3d05f44bd33eefd8d624c8cb5bc2dd5ae850bdcad2b12e" exitCode=0
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.219109 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" event={"ID":"482b8d66-a8d0-4d21-ba06-6f818f092ea7","Type":"ContainerDied","Data":"f69bc3d24231e94ded3d05f44bd33eefd8d624c8cb5bc2dd5ae850bdcad2b12e"}
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.292985 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdqn\" (UniqueName: \"kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293096 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293129 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293163 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293197 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293224 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc7m\" (UniqueName: \"kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.293255 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.294096 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.294386 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.294669 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.794654314 +0000 UTC m=+149.742952358 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.298737 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-776sf" event={"ID":"0052df5f-706f-4dc9-b03e-dbd98d090fb3","Type":"ContainerStarted","Data":"4c11d83b85e0e44435243c3a2353c4286b92bd3a4adce1daab859c5e245b61c3"}
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.375841 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnm7s"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.395600 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.395847 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc7m\" (UniqueName: \"kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.395965 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.396906 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.89688293 +0000 UTC m=+149.845180974 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.397793 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.397873 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.414311 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:49.914289297 +0000 UTC m=+149.862587341 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.415337 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.452599 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.498646 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.500154 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.00013368 +0000 UTC m=+149.948431724 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.525959 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-grstx"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.538039 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.575579 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdqn\" (UniqueName: \"kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn\") pod \"community-operators-f2g2b\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.584856 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-grstx"]
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.604891 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.605333 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.105317924 +0000 UTC m=+150.053615968 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.647363 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc7m\" (UniqueName: \"kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m\") pod \"certified-operators-pt7d5\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.656467 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.705960 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.706132 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqxd\" (UniqueName: \"kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.706235 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.706256 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.706403 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.206385212 +0000 UTC m=+150.154683256 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.812718 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqxd\" (UniqueName: \"kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.812830 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.812912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.812929 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.815781 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.315756885 +0000 UTC m=+150.264055109 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.818822 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.825541 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.872304 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqxd\" (UniqueName: \"kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd\") pod \"community-operators-grstx\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.896416 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.914958 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:49 crc kubenswrapper[4780]: E0929 18:45:49.915530 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.415493818 +0000 UTC m=+150.363791862 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:49 crc kubenswrapper[4780]: I0929 18:45:49.936928 4780 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.017018 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.017435 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.517419105 +0000 UTC m=+150.465717149 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.028309 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 29 18:45:50 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld
Sep 29 18:45:50 crc kubenswrapper[4780]: [+]process-running ok
Sep 29 18:45:50 crc kubenswrapper[4780]: healthz check failed
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.028376 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.122314 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.122680 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.62265806 +0000 UTC m=+150.570956104 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.138429 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.162783 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.179308 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.187381 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.187778 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.196262 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.224006 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.224868 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.724854356 +0000 UTC m=+150.673152400 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.328885 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.329509 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.329612 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.829573054 +0000 UTC m=+150.777871098 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.329786 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.329854 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.330266 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.830260187 +0000 UTC m=+150.778558231 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.393696 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.404951 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-776sf" event={"ID":"0052df5f-706f-4dc9-b03e-dbd98d090fb3","Type":"ContainerStarted","Data":"ab8758ae8a82eb38b0ddbd50a70c40094b65c5ec01651101bf996092dc05dc28"}
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.434592 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.434796 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.434843 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.435363 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:50.935345468 +0000 UTC m=+150.883643502 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.435409 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.497039 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.541119 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.547631 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.047596606 +0000 UTC m=+150.995894810 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.563704 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.622513 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.644574 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.644928 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.144893249 +0000 UTC m=+151.093191293 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.682178 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2g2b"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.715822 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"]
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.734962 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqdk7"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.747666 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.794715 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.294691681 +0000 UTC m=+151.242989725 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.800472 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.852816 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.853383 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.353340553 +0000 UTC m=+151.301638597 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.853549 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.853629 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7cp\" (UniqueName: \"kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.853671 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.853751 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.854472 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.354464571 +0000 UTC m=+151.302762615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8c67f" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.899869 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"] Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.940072 4780 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-29T18:45:49.936961049Z","Handler":null,"Name":""} Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.954803 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.955233 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.955268 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7cp\" (UniqueName: \"kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.955292 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.955732 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: E0929 18:45:50.955826 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-29 18:45:51.455804957 +0000 UTC m=+151.404103001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.956083 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.957280 4780 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 29 18:45:50 crc kubenswrapper[4780]: I0929 18:45:50.957334 4780 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.048987 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7cp\" (UniqueName: \"kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp\") pod \"redhat-marketplace-hqdk7\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.057039 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.069512 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 18:45:51 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld Sep 29 18:45:51 crc kubenswrapper[4780]: [+]process-running ok Sep 29 18:45:51 crc kubenswrapper[4780]: healthz check failed Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.069582 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.087077 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-grstx"] Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.116171 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
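The repeated errors above are a single startup ordering race, not many independent failures: every mount and unmount of pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" until csi_plugin.go:113 registers the plugin at 18:45:50.957334, and nestedpendingoperations.go parks each failed operation with a "No retries permitted until" backoff (durationBeforeRetry 500ms). The Go sketch below illustrates that lookup-then-register pattern under the assumption of a simple map-backed registry; the type and method names are invented for the example and are not kubelet's real API.

    package main

    import (
    	"errors"
    	"fmt"
    	"sync"
    )

    // driverRegistry is a hypothetical stand-in for kubelet's table of
    // registered CSI plugins; the names here are invented for illustration.
    type driverRegistry struct {
    	mu      sync.RWMutex
    	drivers map[string]string // driver name -> unix socket endpoint
    }

    func newDriverRegistry() *driverRegistry {
    	return &driverRegistry{drivers: make(map[string]string)}
    }

    // Register records a driver once its registration socket has been
    // processed, as the csi_plugin.go:113 line in the log reports.
    func (r *driverRegistry) Register(name, endpoint string) {
    	r.mu.Lock()
    	defer r.mu.Unlock()
    	r.drivers[name] = endpoint
    }

    // Client fails with a message shaped like the log's error until Register runs.
    func (r *driverRegistry) Client(name string) (string, error) {
    	r.mu.RLock()
    	defer r.mu.RUnlock()
    	ep, ok := r.drivers[name]
    	if !ok {
    		return "", errors.New("driver name " + name + " not found in the list of registered CSI drivers")
    	}
    	return ep, nil
    }

    func main() {
    	reg := newDriverRegistry()
    	// Mount attempt before registration fails and would be re-queued with backoff.
    	if _, err := reg.Client("kubevirt.io.hostpath-provisioner"); err != nil {
    		fmt.Println("Error:", err)
    	}
    	// Registration lands; the retried operation now finds the driver.
    	reg.Register("kubevirt.io.hostpath-provisioner", "/var/lib/kubelet/plugins/csi-hostpath/csi.sock")
    	if ep, err := reg.Client("kubevirt.io.hostpath-provisioner"); err == nil {
    		fmt.Println("client connected to", ep)
    	}
    }

Once Register has run, the same lookup that produced the error lines succeeds, which is exactly the transition visible in the log between 18:45:50.955826 and 18:45:51.116171.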
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.116612 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.127100 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"]
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.128894 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.148492 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.178722 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqdk7"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.195274 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"]
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.211958 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8c67f\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") " pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.252221 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.259981 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260032 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume\") pod \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") "
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260108 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxv98\" (UniqueName: \"kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98\") pod \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") "
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260151 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume\") pod \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\" (UID: \"482b8d66-a8d0-4d21-ba06-6f818f092ea7\") "
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260406 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260470 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.260506 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vptp8\" (UniqueName: \"kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.261857 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume" (OuterVolumeSpecName: "config-volume") pod "482b8d66-a8d0-4d21-ba06-6f818f092ea7" (UID: "482b8d66-a8d0-4d21-ba06-6f818f092ea7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.278618 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98" (OuterVolumeSpecName: "kube-api-access-cxv98") pod "482b8d66-a8d0-4d21-ba06-6f818f092ea7" (UID: "482b8d66-a8d0-4d21-ba06-6f818f092ea7"). InnerVolumeSpecName "kube-api-access-cxv98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.278705 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "482b8d66-a8d0-4d21-ba06-6f818f092ea7" (UID: "482b8d66-a8d0-4d21-ba06-6f818f092ea7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.306417 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.369499 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.370063 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.370105 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vptp8\" (UniqueName: \"kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.370182 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/482b8d66-a8d0-4d21-ba06-6f818f092ea7-config-volume\") on node \"crc\" DevicePath \"\""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.370196 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxv98\" (UniqueName: \"kubernetes.io/projected/482b8d66-a8d0-4d21-ba06-6f818f092ea7-kube-api-access-cxv98\") on node \"crc\" DevicePath \"\""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.370223 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/482b8d66-a8d0-4d21-ba06-6f818f092ea7-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.371156 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j"
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.371482 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.372827 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.404554 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vptp8\" (UniqueName: \"kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8\") pod \"redhat-marketplace-zl69j\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.444619 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-776sf" event={"ID":"0052df5f-706f-4dc9-b03e-dbd98d090fb3","Type":"ContainerStarted","Data":"02b35300b17d3813666c0eb4171685b6ca4c38085566834badf4e44313fd11d5"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.496398 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerStarted","Data":"eb0bf073583a22cef5bfb30898d10bae5b31f1e7ac04875b6351bd27a33d8776"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.537920 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-776sf" podStartSLOduration=11.537886868 podStartE2EDuration="11.537886868s" podCreationTimestamp="2025-09-29 18:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:51.520150271 +0000 UTC m=+151.468448305" watchObservedRunningTime="2025-09-29 18:45:51.537886868 +0000 UTC m=+151.486184912" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.586392 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" event={"ID":"482b8d66-a8d0-4d21-ba06-6f818f092ea7","Type":"ContainerDied","Data":"f2ddfc7dd0fa443fadfdc6e9bb6891916bf6893c17e707ce5d8a2d7cb12576c9"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.591916 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ddfc7dd0fa443fadfdc6e9bb6891916bf6893c17e707ce5d8a2d7cb12576c9" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.587986 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.619075 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.620340 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5df6b3d8897833d05993ce478e1a8e3747252fd1117427bf131c1ea2b9bae72d"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.620385 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4c99834f5f2cd108f8cab40aad807aab96de0ad18bd08733addbc70aeb37a1b6"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.622671 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.652119 4780 generic.go:334] "Generic (PLEG): container finished" podID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerID="cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e" exitCode=0 Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.652203 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerDied","Data":"cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.652254 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerStarted","Data":"1ccc6c55a0edad2d0a7e7e196bec54edb2ed767d90490d69d85f2d29a44a6a60"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.655665 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"abc63bec4dbbcc83613ee57a0b17c4b84c111217ebf9eb368df1e5cef2f6cf92"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.655706 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"14204d184a583c4059aada8217a6371e1883699bb3327beec508354a7356fbfd"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.656785 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.700206 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7adab0fa85fb0215770d6584f503993dc3ac4257f2a5b36e724efaa97376c6ec"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.700265 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8dd3a16c733a5d055667a603fb177feb10133d7038e29a3425b7b218e50489d0"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.709494 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"] 
Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.710171 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerStarted","Data":"2fb79a636e33ad6294493a19abf9dd59665d338b5b2d09eba83ba423974fe894"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.742855 4780 generic.go:334] "Generic (PLEG): container finished" podID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerID="02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43" exitCode=0 Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.744374 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerDied","Data":"02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.744407 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerStarted","Data":"1b825aadb9b0186df276ec52345615991c96b0c1f31504f10a03f8629c312f52"} Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.805879 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"] Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.902338 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:45:51 crc kubenswrapper[4780]: E0929 18:45:51.902748 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482b8d66-a8d0-4d21-ba06-6f818f092ea7" containerName="collect-profiles" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.902767 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b8d66-a8d0-4d21-ba06-6f818f092ea7" containerName="collect-profiles" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.902898 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="482b8d66-a8d0-4d21-ba06-6f818f092ea7" containerName="collect-profiles" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.904073 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.908293 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.923419 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.932698 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.943154 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.946082 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.946199 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:45:51 crc kubenswrapper[4780]: I0929 18:45:51.974491 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.004652 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.004747 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhmj\" (UniqueName: \"kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.004866 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.004894 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.004940 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.018366 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 18:45:52 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld Sep 29 18:45:52 crc kubenswrapper[4780]: [+]process-running ok Sep 29 18:45:52 crc kubenswrapper[4780]: healthz check failed Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.018440 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" 
podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.106063 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.106130 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.106197 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.106227 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.106268 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhmj\" (UniqueName: \"kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.107209 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.107292 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities\") pod \"redhat-operators-8bcsn\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.107369 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.134084 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhmj\" (UniqueName: \"kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj\") pod \"redhat-operators-8bcsn\" (UID: 
\"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.134094 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.134380 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.142495 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gj4p8" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.244376 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.287389 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.304244 4780 patch_prober.go:28] interesting pod/downloads-7954f5f757-7v67w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.304317 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7v67w" podUID="d5a6b98f-17b6-4e3c-aa64-9b05b9d23547" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.304600 4780 patch_prober.go:28] interesting pod/downloads-7954f5f757-7v67w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.304664 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7v67w" podUID="d5a6b98f-17b6-4e3c-aa64-9b05b9d23547" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.311029 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.312773 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.317422 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.328938 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.413920 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.414496 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbvb\" (UniqueName: \"kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.414530 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.516786 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.517573 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.517671 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fbvb\" (UniqueName: \"kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.517710 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.520265 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " 
pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.540207 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbvb\" (UniqueName: \"kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb\") pod \"redhat-operators-hms9w\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.595961 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.640649 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.674941 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.787565 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerID="76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e" exitCode=0 Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.789479 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.790506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerStarted","Data":"a4016787c14e0f90a5c1b349dfba7f1e46b16fc1599a3abca50417612ea068b8"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.790548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerDied","Data":"76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.790567 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerStarted","Data":"b7bc311b4957d4364ae618a8067170f48bcf0b18ecd7b4da1ec1984f8e43cc47"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.794251 4780 generic.go:334] "Generic (PLEG): container finished" podID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerID="68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339" exitCode=0 Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.794905 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerDied","Data":"68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.794930 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerStarted","Data":"9885863f9a370fca9b5af6395707d7152dd1822b6d6a3026fff77ff450a3e3dc"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.802669 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"d120ffc1-61fe-49e0-9775-e24e4356f900","Type":"ContainerStarted","Data":"f4b6b7c9d70573c18796e3f94941f9db9eb54f1379ee7176849257aa39416b2b"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.810950 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerDied","Data":"bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.810840 4780 generic.go:334] "Generic (PLEG): container finished" podID="3fcaee64-db78-46be-a54e-8412e4394681" containerID="bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6" exitCode=0 Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.817690 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46b3f97-035d-4966-b616-30e6f1f20d7a","Type":"ContainerStarted","Data":"142896dc6b7ab0f634e6a63785bc7ddc05074849f1737922914a942fb5ecca32"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.817733 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46b3f97-035d-4966-b616-30e6f1f20d7a","Type":"ContainerStarted","Data":"76848a92e530e27e62b360a25de3834079114cba855543a0f32c927105492f2c"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.821109 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" event={"ID":"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6","Type":"ContainerStarted","Data":"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.821142 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" event={"ID":"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6","Type":"ContainerStarted","Data":"3b1c281ed7acf7ccf0534cd9a07bbbd6d9000f84f183ee71678f75110294d37b"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.821559 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.845328 4780 generic.go:334] "Generic (PLEG): container finished" podID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerID="cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0" exitCode=0 Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.846252 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerDied","Data":"cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0"} Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.877878 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" podStartSLOduration=130.877852522 podStartE2EDuration="2m10.877852522s" podCreationTimestamp="2025-09-29 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:52.875527765 +0000 UTC m=+152.823825809" watchObservedRunningTime="2025-09-29 18:45:52.877852522 +0000 UTC m=+152.826150566" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.929633 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:45:52 crc kubenswrapper[4780]: I0929 18:45:52.960645 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.960624464 podStartE2EDuration="2.960624464s" podCreationTimestamp="2025-09-29 18:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:52.918433296 +0000 UTC m=+152.866731340" watchObservedRunningTime="2025-09-29 18:45:52.960624464 +0000 UTC m=+152.908922508" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.020190 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.023097 4780 patch_prober.go:28] interesting pod/router-default-5444994796-vgx7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 29 18:45:53 crc kubenswrapper[4780]: [-]has-synced failed: reason withheld Sep 29 18:45:53 crc kubenswrapper[4780]: [+]process-running ok Sep 29 18:45:53 crc kubenswrapper[4780]: healthz check failed Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.023144 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vgx7v" podUID="26a049a2-59dd-4762-ac58-9eb88fc892a4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.201640 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.201705 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.213885 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-t7zv9" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.215425 4780 patch_prober.go:28] interesting pod/console-f9d7485db-77bv2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.215514 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-77bv2" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.326341 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.435868 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.854595 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"d120ffc1-61fe-49e0-9775-e24e4356f900","Type":"ContainerStarted","Data":"1343886883d69cd146345fd044d5019defc64e28b5ed6aa65dd839b8c184124c"} Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.858351 4780 generic.go:334] "Generic (PLEG): container finished" podID="e46b3f97-035d-4966-b616-30e6f1f20d7a" containerID="142896dc6b7ab0f634e6a63785bc7ddc05074849f1737922914a942fb5ecca32" exitCode=0 Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.858502 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46b3f97-035d-4966-b616-30e6f1f20d7a","Type":"ContainerDied","Data":"142896dc6b7ab0f634e6a63785bc7ddc05074849f1737922914a942fb5ecca32"} Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.875325 4780 generic.go:334] "Generic (PLEG): container finished" podID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerID="911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d" exitCode=0 Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.875475 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerDied","Data":"911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d"} Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.885400 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.885377464 podStartE2EDuration="2.885377464s" podCreationTimestamp="2025-09-29 18:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:45:53.882624363 +0000 UTC m=+153.830922407" watchObservedRunningTime="2025-09-29 18:45:53.885377464 +0000 UTC m=+153.833675518" Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.895624 4780 generic.go:334] "Generic (PLEG): container finished" podID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerID="5ab1cab88eebe4b603131db030dd3586441c8a4a22f8e8bc0be227f9a37410f6" exitCode=0 Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.896832 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerDied","Data":"5ab1cab88eebe4b603131db030dd3586441c8a4a22f8e8bc0be227f9a37410f6"} Sep 29 18:45:53 crc kubenswrapper[4780]: I0929 18:45:53.896856 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerStarted","Data":"226a74f85799c1ad2e0fd378fb8325eda45dac8505ed974c3bf98512ad8f0ac2"} Sep 29 18:45:54 crc kubenswrapper[4780]: I0929 18:45:54.016626 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:54 crc kubenswrapper[4780]: I0929 18:45:54.019827 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vgx7v" Sep 29 18:45:54 crc kubenswrapper[4780]: I0929 18:45:54.917836 4780 generic.go:334] "Generic (PLEG): container finished" podID="d120ffc1-61fe-49e0-9775-e24e4356f900" containerID="1343886883d69cd146345fd044d5019defc64e28b5ed6aa65dd839b8c184124c" exitCode=0 Sep 29 18:45:54 crc kubenswrapper[4780]: I0929 18:45:54.919063 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d120ffc1-61fe-49e0-9775-e24e4356f900","Type":"ContainerDied","Data":"1343886883d69cd146345fd044d5019defc64e28b5ed6aa65dd839b8c184124c"} Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.299704 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.490567 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access\") pod \"e46b3f97-035d-4966-b616-30e6f1f20d7a\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.490735 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir\") pod \"e46b3f97-035d-4966-b616-30e6f1f20d7a\" (UID: \"e46b3f97-035d-4966-b616-30e6f1f20d7a\") " Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.490826 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e46b3f97-035d-4966-b616-30e6f1f20d7a" (UID: "e46b3f97-035d-4966-b616-30e6f1f20d7a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.491260 4780 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46b3f97-035d-4966-b616-30e6f1f20d7a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.511528 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e46b3f97-035d-4966-b616-30e6f1f20d7a" (UID: "e46b3f97-035d-4966-b616-30e6f1f20d7a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.599104 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46b3f97-035d-4966-b616-30e6f1f20d7a-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.947186 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46b3f97-035d-4966-b616-30e6f1f20d7a","Type":"ContainerDied","Data":"76848a92e530e27e62b360a25de3834079114cba855543a0f32c927105492f2c"} Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.947222 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 29 18:45:55 crc kubenswrapper[4780]: I0929 18:45:55.947257 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76848a92e530e27e62b360a25de3834079114cba855543a0f32c927105492f2c" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.504960 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.524175 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access\") pod \"d120ffc1-61fe-49e0-9775-e24e4356f900\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.524711 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir\") pod \"d120ffc1-61fe-49e0-9775-e24e4356f900\" (UID: \"d120ffc1-61fe-49e0-9775-e24e4356f900\") " Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.527923 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d120ffc1-61fe-49e0-9775-e24e4356f900" (UID: "d120ffc1-61fe-49e0-9775-e24e4356f900"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.529791 4780 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d120ffc1-61fe-49e0-9775-e24e4356f900-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.554582 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d120ffc1-61fe-49e0-9775-e24e4356f900" (UID: "d120ffc1-61fe-49e0-9775-e24e4356f900"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.642205 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d120ffc1-61fe-49e0-9775-e24e4356f900-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.971172 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d120ffc1-61fe-49e0-9775-e24e4356f900","Type":"ContainerDied","Data":"f4b6b7c9d70573c18796e3f94941f9db9eb54f1379ee7176849257aa39416b2b"} Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.971225 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b6b7c9d70573c18796e3f94941f9db9eb54f1379ee7176849257aa39416b2b" Sep 29 18:45:56 crc kubenswrapper[4780]: I0929 18:45:56.971246 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 29 18:45:58 crc kubenswrapper[4780]: I0929 18:45:58.079628 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tcnj7" Sep 29 18:46:02 crc kubenswrapper[4780]: I0929 18:46:02.319815 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7v67w" Sep 29 18:46:03 crc kubenswrapper[4780]: I0929 18:46:03.211625 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:46:03 crc kubenswrapper[4780]: I0929 18:46:03.216037 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:46:03 crc kubenswrapper[4780]: I0929 18:46:03.223478 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:46:03 crc kubenswrapper[4780]: I0929 18:46:03.223628 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:46:05 crc kubenswrapper[4780]: I0929 18:46:05.011012 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:46:05 crc kubenswrapper[4780]: I0929 18:46:05.018578 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7b75391-2034-4284-b779-eb7b1e9da774-metrics-certs\") pod \"network-metrics-daemon-j6vxr\" (UID: \"f7b75391-2034-4284-b779-eb7b1e9da774\") " pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:46:05 crc kubenswrapper[4780]: I0929 18:46:05.073645 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6vxr" Sep 29 18:46:11 crc kubenswrapper[4780]: I0929 18:46:11.262004 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" Sep 29 18:46:20 crc kubenswrapper[4780]: I0929 18:46:20.558611 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6vxr"] Sep 29 18:46:20 crc kubenswrapper[4780]: W0929 18:46:20.588903 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b75391_2034_4284_b779_eb7b1e9da774.slice/crio-30c0ff4bcd48f99acc5e360c09a5008d5ff7390e5de7072fd21254f8e3fcaae3 WatchSource:0}: Error finding container 30c0ff4bcd48f99acc5e360c09a5008d5ff7390e5de7072fd21254f8e3fcaae3: Status 404 returned error can't find the container with id 30c0ff4bcd48f99acc5e360c09a5008d5ff7390e5de7072fd21254f8e3fcaae3 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.171165 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" event={"ID":"f7b75391-2034-4284-b779-eb7b1e9da774","Type":"ContainerStarted","Data":"30c0ff4bcd48f99acc5e360c09a5008d5ff7390e5de7072fd21254f8e3fcaae3"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.175277 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerDied","Data":"eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.175431 4780 generic.go:334] "Generic (PLEG): container finished" podID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerID="eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.182451 4780 generic.go:334] "Generic (PLEG): container finished" podID="3fcaee64-db78-46be-a54e-8412e4394681" containerID="5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.182535 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerDied","Data":"5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.189363 4780 generic.go:334] "Generic (PLEG): container finished" podID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerID="b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.189470 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerDied","Data":"b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.206948 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerID="28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.207104 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" 
event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerDied","Data":"28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.210840 4780 generic.go:334] "Generic (PLEG): container finished" podID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerID="1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.210956 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerDied","Data":"1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.215290 4780 generic.go:334] "Generic (PLEG): container finished" podID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerID="d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.215340 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerDied","Data":"d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.221185 4780 generic.go:334] "Generic (PLEG): container finished" podID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerID="b03195390c48c298ed73bd149e953fb45fa989483f24c3b5f8c07a54d919b1bd" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.221659 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerDied","Data":"b03195390c48c298ed73bd149e953fb45fa989483f24c3b5f8c07a54d919b1bd"} Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.225807 4780 generic.go:334] "Generic (PLEG): container finished" podID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerID="e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6" exitCode=0 Sep 29 18:46:21 crc kubenswrapper[4780]: I0929 18:46:21.225900 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerDied","Data":"e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6"} Sep 29 18:46:22 crc kubenswrapper[4780]: I0929 18:46:22.237991 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" event={"ID":"f7b75391-2034-4284-b779-eb7b1e9da774","Type":"ContainerStarted","Data":"644aff2de0de2ae0b6beab662f98b7c52e61071989790d4bb9736a7b7c0278bd"} Sep 29 18:46:22 crc kubenswrapper[4780]: I0929 18:46:22.238438 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6vxr" event={"ID":"f7b75391-2034-4284-b779-eb7b1e9da774","Type":"ContainerStarted","Data":"304785f7bd3be0a35bdddbd60078ceadf46af2b0a903474987fdc8faca4d1fe6"} Sep 29 18:46:22 crc kubenswrapper[4780]: I0929 18:46:22.275095 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j6vxr" podStartSLOduration=161.275026559 podStartE2EDuration="2m41.275026559s" podCreationTimestamp="2025-09-29 18:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 
Sep 29 18:46:23 crc kubenswrapper[4780]: I0929 18:46:23.006084 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tjr5n"
Sep 29 18:46:23 crc kubenswrapper[4780]: I0929 18:46:23.248256 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerStarted","Data":"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7"}
Sep 29 18:46:24 crc kubenswrapper[4780]: I0929 18:46:24.272332 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qtrcx" podStartSLOduration=5.139889231 podStartE2EDuration="36.272308055s" podCreationTimestamp="2025-09-29 18:45:48 +0000 UTC" firstStartedPulling="2025-09-29 18:45:51.656377573 +0000 UTC m=+151.604675617" lastFinishedPulling="2025-09-29 18:46:22.788796387 +0000 UTC m=+182.737094441" observedRunningTime="2025-09-29 18:46:24.269207652 +0000 UTC m=+184.217505696" watchObservedRunningTime="2025-09-29 18:46:24.272308055 +0000 UTC m=+184.220606099"
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.259754 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerStarted","Data":"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414"}
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.262747 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerStarted","Data":"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4"}
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.265073 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerStarted","Data":"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a"}
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.321313 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8bcsn" podStartSLOduration=3.241737208 podStartE2EDuration="34.321290281s" podCreationTimestamp="2025-09-29 18:45:51 +0000 UTC" firstStartedPulling="2025-09-29 18:45:53.9142318 +0000 UTC m=+153.862529844" lastFinishedPulling="2025-09-29 18:46:24.993784873 +0000 UTC m=+184.942082917" observedRunningTime="2025-09-29 18:46:25.31792813 +0000 UTC m=+185.266226174" watchObservedRunningTime="2025-09-29 18:46:25.321290281 +0000 UTC m=+185.269588325"
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.322491 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zl69j" podStartSLOduration=2.5339672049999997 podStartE2EDuration="34.322484071s" podCreationTimestamp="2025-09-29 18:45:51 +0000 UTC" firstStartedPulling="2025-09-29 18:45:52.792779054 +0000 UTC m=+152.741077098" lastFinishedPulling="2025-09-29 18:46:24.58129592 +0000 UTC m=+184.529593964" observedRunningTime="2025-09-29 18:46:25.295455496 +0000 UTC m=+185.243753540" watchObservedRunningTime="2025-09-29 18:46:25.322484071 +0000 UTC m=+185.270782115"
Sep 29 18:46:25 crc kubenswrapper[4780]: I0929 18:46:25.348459 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqdk7" podStartSLOduration=3.120145258 podStartE2EDuration="35.348432651s" podCreationTimestamp="2025-09-29 18:45:50 +0000 UTC" firstStartedPulling="2025-09-29 18:45:52.799877359 +0000 UTC m=+152.748175403" lastFinishedPulling="2025-09-29 18:46:25.028164752 +0000 UTC m=+184.976462796" observedRunningTime="2025-09-29 18:46:25.342336359 +0000 UTC m=+185.290634403" watchObservedRunningTime="2025-09-29 18:46:25.348432651 +0000 UTC m=+185.296730695"
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.273016 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerStarted","Data":"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f"}
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.276172 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerStarted","Data":"0e898f4a8b37b27ba32d97f392ed453e210649460464fdac481669e4d2c53317"}
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.278376 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerStarted","Data":"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6"}
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.281807 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerStarted","Data":"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9"}
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.323115 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-grstx" podStartSLOduration=5.032036883 podStartE2EDuration="37.323089515s" podCreationTimestamp="2025-09-29 18:45:49 +0000 UTC" firstStartedPulling="2025-09-29 18:45:52.85878515 +0000 UTC m=+152.807083194" lastFinishedPulling="2025-09-29 18:46:25.149837782 +0000 UTC m=+185.098135826" observedRunningTime="2025-09-29 18:46:26.299271746 +0000 UTC m=+186.247569790" watchObservedRunningTime="2025-09-29 18:46:26.323089515 +0000 UTC m=+186.271387559"
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.324326 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pt7d5" podStartSLOduration=3.725174216 podStartE2EDuration="37.324320476s" podCreationTimestamp="2025-09-29 18:45:49 +0000 UTC" firstStartedPulling="2025-09-29 18:45:51.747269534 +0000 UTC m=+151.695567578" lastFinishedPulling="2025-09-29 18:46:25.346415794 +0000 UTC m=+185.294713838" observedRunningTime="2025-09-29 18:46:26.321478541 +0000 UTC m=+186.269776585" watchObservedRunningTime="2025-09-29 18:46:26.324320476 +0000 UTC m=+186.272618520"
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.344684 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hms9w" podStartSLOduration=2.957926571 podStartE2EDuration="34.344664289s" podCreationTimestamp="2025-09-29 18:45:52 +0000 UTC" firstStartedPulling="2025-09-29 18:45:53.913837397 +0000 UTC m=+153.862135441" lastFinishedPulling="2025-09-29 18:46:25.300575115 +0000 UTC m=+185.248873159" observedRunningTime="2025-09-29 18:46:26.343588274 +0000 UTC m=+186.291886318" watchObservedRunningTime="2025-09-29 18:46:26.344664289 +0000 UTC m=+186.292962333"
Sep 29 18:46:26 crc kubenswrapper[4780]: I0929 18:46:26.362908 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f2g2b" podStartSLOduration=4.948969558 podStartE2EDuration="38.362891713s" podCreationTimestamp="2025-09-29 18:45:48 +0000 UTC" firstStartedPulling="2025-09-29 18:45:51.722320067 +0000 UTC m=+151.670618111" lastFinishedPulling="2025-09-29 18:46:25.136242212 +0000 UTC m=+185.084540266" observedRunningTime="2025-09-29 18:46:26.360887497 +0000 UTC m=+186.309185541" watchObservedRunningTime="2025-09-29 18:46:26.362891713 +0000 UTC m=+186.311189757"
Sep 29 18:46:28 crc kubenswrapper[4780]: I0929 18:46:28.778715 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.087703 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.088072 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.478809 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.523609 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qtrcx"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.657581 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.657675 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.700213 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f2g2b"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.897756 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.897831 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:46:29 crc kubenswrapper[4780]: I0929 18:46:29.948072 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pt7d5"
Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.140282 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.140354 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-grstx"
Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.192644 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-grstx"
pod="openshift-marketplace/community-operators-grstx" Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.350027 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-grstx" Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.352158 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pt7d5" Sep 29 18:46:30 crc kubenswrapper[4780]: I0929 18:46:30.364617 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f2g2b" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.180313 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.180713 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.229800 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.326590 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-grstx"] Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.371806 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.620779 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.620873 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:31 crc kubenswrapper[4780]: I0929 18:46:31.678484 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.246182 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.246248 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.298005 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.331035 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-grstx" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="registry-server" containerID="cri-o://660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f" gracePeriod=2 Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.380005 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.384543 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.641471 4780 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.641877 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:32 crc kubenswrapper[4780]: I0929 18:46:32.688972 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.223408 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.223494 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.271353 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-grstx" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.352343 4780 generic.go:334] "Generic (PLEG): container finished" podID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerID="660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f" exitCode=0 Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.352342 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-grstx" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.352469 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerDied","Data":"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f"} Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.352922 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-grstx" event={"ID":"c21481e6-9cfa-46bf-a667-52b4a9b336d1","Type":"ContainerDied","Data":"eb0bf073583a22cef5bfb30898d10bae5b31f1e7ac04875b6351bd27a33d8776"} Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.352953 4780 scope.go:117] "RemoveContainer" containerID="660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.383355 4780 scope.go:117] "RemoveContainer" containerID="b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.404098 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.416197 4780 scope.go:117] "RemoveContainer" containerID="cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.439236 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content\") pod \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.439441 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities\") pod \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.439496 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsqxd\" (UniqueName: \"kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd\") pod \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\" (UID: \"c21481e6-9cfa-46bf-a667-52b4a9b336d1\") " Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.440409 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities" (OuterVolumeSpecName: "utilities") pod "c21481e6-9cfa-46bf-a667-52b4a9b336d1" (UID: "c21481e6-9cfa-46bf-a667-52b4a9b336d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.444983 4780 scope.go:117] "RemoveContainer" containerID="660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f" Sep 29 18:46:33 crc kubenswrapper[4780]: E0929 18:46:33.445730 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f\": container with ID starting with 660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f not found: ID does not exist" containerID="660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.445801 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f"} err="failed to get container status \"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f\": rpc error: code = NotFound desc = could not find container \"660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f\": container with ID starting with 660776d876faadf9b3bedb262497dc3e6d01586e83eaaa50abc08bae34d7c17f not found: ID does not exist" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.445883 4780 scope.go:117] "RemoveContainer" containerID="b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f" Sep 29 18:46:33 crc kubenswrapper[4780]: E0929 18:46:33.446369 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f\": container with ID starting with b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f not found: ID does not exist" containerID="b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.446405 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f"} err="failed to get container status \"b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f\": rpc error: code = NotFound desc = could not find container \"b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f\": container with ID starting with b2c3a37566fd7774e350f6dac70a36f563d887a4ead65033a981406d4c36389f not found: ID does not exist" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.446424 4780 scope.go:117] "RemoveContainer" containerID="cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0" Sep 29 18:46:33 crc kubenswrapper[4780]: E0929 18:46:33.447651 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0\": container with ID starting with cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0 not found: ID does not exist" containerID="cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.447681 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0"} err="failed to get container status \"cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0\": rpc error: code = NotFound desc = could not 
find container \"cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0\": container with ID starting with cfe5aafa87d992aff908e0d28b3622074ecad002e9059643c42228ccaa846ae0 not found: ID does not exist" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.448299 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd" (OuterVolumeSpecName: "kube-api-access-bsqxd") pod "c21481e6-9cfa-46bf-a667-52b4a9b336d1" (UID: "c21481e6-9cfa-46bf-a667-52b4a9b336d1"). InnerVolumeSpecName "kube-api-access-bsqxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.494659 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c21481e6-9cfa-46bf-a667-52b4a9b336d1" (UID: "c21481e6-9cfa-46bf-a667-52b4a9b336d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.528428 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"] Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.528697 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pt7d5" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="registry-server" containerID="cri-o://8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6" gracePeriod=2 Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.541698 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.541759 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsqxd\" (UniqueName: \"kubernetes.io/projected/c21481e6-9cfa-46bf-a667-52b4a9b336d1-kube-api-access-bsqxd\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.541772 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21481e6-9cfa-46bf-a667-52b4a9b336d1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.694253 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-grstx"] Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.701718 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-grstx"] Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.728445 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"] Sep 29 18:46:33 crc kubenswrapper[4780]: I0929 18:46:33.912336 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pt7d5" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.049288 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldc7m\" (UniqueName: \"kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m\") pod \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.049862 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content\") pod \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.049980 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities\") pod \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\" (UID: \"1e054aba-94dc-4945-b4cd-0b7d01db39c4\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.050990 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities" (OuterVolumeSpecName: "utilities") pod "1e054aba-94dc-4945-b4cd-0b7d01db39c4" (UID: "1e054aba-94dc-4945-b4cd-0b7d01db39c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.060316 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m" (OuterVolumeSpecName: "kube-api-access-ldc7m") pod "1e054aba-94dc-4945-b4cd-0b7d01db39c4" (UID: "1e054aba-94dc-4945-b4cd-0b7d01db39c4"). InnerVolumeSpecName "kube-api-access-ldc7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.096295 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e054aba-94dc-4945-b4cd-0b7d01db39c4" (UID: "1e054aba-94dc-4945-b4cd-0b7d01db39c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.152029 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldc7m\" (UniqueName: \"kubernetes.io/projected/1e054aba-94dc-4945-b4cd-0b7d01db39c4-kube-api-access-ldc7m\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.152097 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.152109 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e054aba-94dc-4945-b4cd-0b7d01db39c4-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.365197 4780 generic.go:334] "Generic (PLEG): container finished" podID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerID="8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6" exitCode=0 Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.369178 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerDied","Data":"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6"} Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.369245 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt7d5" event={"ID":"1e054aba-94dc-4945-b4cd-0b7d01db39c4","Type":"ContainerDied","Data":"1b825aadb9b0186df276ec52345615991c96b0c1f31504f10a03f8629c312f52"} Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.369275 4780 scope.go:117] "RemoveContainer" containerID="8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.369483 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pt7d5" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.370764 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zl69j" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="registry-server" containerID="cri-o://d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414" gracePeriod=2 Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.410691 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"] Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.416494 4780 scope.go:117] "RemoveContainer" containerID="1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.426543 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pt7d5"] Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.484110 4780 scope.go:117] "RemoveContainer" containerID="02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.519521 4780 scope.go:117] "RemoveContainer" containerID="8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6" Sep 29 18:46:34 crc kubenswrapper[4780]: E0929 18:46:34.520019 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6\": container with ID starting with 8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6 not found: ID does not exist" containerID="8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.520074 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6"} err="failed to get container status \"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6\": rpc error: code = NotFound desc = could not find container \"8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6\": container with ID starting with 8c0c2444771aa2525a3cb32410579e7f9eaf16a4c6151459228205895cb5baa6 not found: ID does not exist" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.520103 4780 scope.go:117] "RemoveContainer" containerID="1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1" Sep 29 18:46:34 crc kubenswrapper[4780]: E0929 18:46:34.520763 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1\": container with ID starting with 1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1 not found: ID does not exist" containerID="1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.520820 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1"} err="failed to get container status \"1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1\": rpc error: code = NotFound desc = could not find container \"1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1\": container with ID starting with 
1dbe59609e6526189a595c5abe2ec4d473a7d3ec01142d9f34a1d4b74b1004a1 not found: ID does not exist" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.520862 4780 scope.go:117] "RemoveContainer" containerID="02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43" Sep 29 18:46:34 crc kubenswrapper[4780]: E0929 18:46:34.521517 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43\": container with ID starting with 02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43 not found: ID does not exist" containerID="02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.521551 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43"} err="failed to get container status \"02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43\": rpc error: code = NotFound desc = could not find container \"02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43\": container with ID starting with 02e7ed164d683cfe1aa927528d9f54375b577a025b77d63417e1bbbdf3e5cb43 not found: ID does not exist" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.758187 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.762462 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" path="/var/lib/kubelet/pods/1e054aba-94dc-4945-b4cd-0b7d01db39c4/volumes" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.763449 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" path="/var/lib/kubelet/pods/c21481e6-9cfa-46bf-a667-52b4a9b336d1/volumes" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.860510 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") pod \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.860574 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vptp8\" (UniqueName: \"kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8\") pod \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.860721 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities\") pod \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\" (UID: \"8d660231-ddc9-400d-9a41-2395dfcbc3d7\") " Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.861851 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities" (OuterVolumeSpecName: "utilities") pod "8d660231-ddc9-400d-9a41-2395dfcbc3d7" (UID: "8d660231-ddc9-400d-9a41-2395dfcbc3d7"). InnerVolumeSpecName "utilities". 
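"Cleaned up orphaned pod volumes dir" above is kubelet housekeeping: once a deleted pod's volumes are all torn down, its /var/lib/kubelet/pods/<uid>/volumes directory can be removed. A rough sketch of that sweep under the directory layout seen in these logs; this is an assumption-laden illustration, not the kubelet's real implementation:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedPodDirs removes the volumes directory of any pod UID under
// root that is no longer in the set of known (still-existing) pods.
func cleanupOrphanedPodDirs(root string, known map[string]bool) error {
	entries, err := os.ReadDir(root)
	if err != nil {
		return err
	}
	for _, e := range entries {
		uid := e.Name()
		if known[uid] {
			continue // pod still exists; leave its volumes alone
		}
		dir := filepath.Join(root, uid, "volumes")
		if err := os.RemoveAll(dir); err != nil {
			return err
		}
		fmt.Printf("Cleaned up orphaned pod volumes dir %s\n", dir)
	}
	return nil
}

func main() {
	// With an empty known set, every pod dir under the root would be swept.
	_ = cleanupOrphanedPodDirs("/var/lib/kubelet/pods", map[string]bool{})
}
```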
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.870031 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8" (OuterVolumeSpecName: "kube-api-access-vptp8") pod "8d660231-ddc9-400d-9a41-2395dfcbc3d7" (UID: "8d660231-ddc9-400d-9a41-2395dfcbc3d7"). InnerVolumeSpecName "kube-api-access-vptp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.882840 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d660231-ddc9-400d-9a41-2395dfcbc3d7" (UID: "8d660231-ddc9-400d-9a41-2395dfcbc3d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.963403 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.963462 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d660231-ddc9-400d-9a41-2395dfcbc3d7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:34 crc kubenswrapper[4780]: I0929 18:46:34.963482 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vptp8\" (UniqueName: \"kubernetes.io/projected/8d660231-ddc9-400d-9a41-2395dfcbc3d7-kube-api-access-vptp8\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.397185 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerID="d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414" exitCode=0 Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.397275 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zl69j" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.397303 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerDied","Data":"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414"} Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.397885 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zl69j" event={"ID":"8d660231-ddc9-400d-9a41-2395dfcbc3d7","Type":"ContainerDied","Data":"b7bc311b4957d4364ae618a8067170f48bcf0b18ecd7b4da1ec1984f8e43cc47"} Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.397911 4780 scope.go:117] "RemoveContainer" containerID="d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.421894 4780 scope.go:117] "RemoveContainer" containerID="28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.438211 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"] Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.441371 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zl69j"] Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.449678 4780 scope.go:117] "RemoveContainer" containerID="76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.466670 4780 scope.go:117] "RemoveContainer" containerID="d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414" Sep 29 18:46:35 crc kubenswrapper[4780]: E0929 18:46:35.467396 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414\": container with ID starting with d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414 not found: ID does not exist" containerID="d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.467447 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414"} err="failed to get container status \"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414\": rpc error: code = NotFound desc = could not find container \"d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414\": container with ID starting with d164c2dca2781fc92848c24b18f56702a1850fd6f08baae14e6b4e9cbf87a414 not found: ID does not exist" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.467476 4780 scope.go:117] "RemoveContainer" containerID="28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a" Sep 29 18:46:35 crc kubenswrapper[4780]: E0929 18:46:35.467980 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a\": container with ID starting with 28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a not found: ID does not exist" containerID="28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.468010 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a"} err="failed to get container status \"28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a\": rpc error: code = NotFound desc = could not find container \"28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a\": container with ID starting with 28bcb97a684468463b04cb5a38e2d9e3f03bcd2370972f845709234ed9edb13a not found: ID does not exist" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.468032 4780 scope.go:117] "RemoveContainer" containerID="76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e" Sep 29 18:46:35 crc kubenswrapper[4780]: E0929 18:46:35.468581 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e\": container with ID starting with 76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e not found: ID does not exist" containerID="76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e" Sep 29 18:46:35 crc kubenswrapper[4780]: I0929 18:46:35.468604 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e"} err="failed to get container status \"76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e\": rpc error: code = NotFound desc = could not find container \"76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e\": container with ID starting with 76f65679b169859d29ec839846a10914afffe3d658b3fc90146a6c36e5bd250e not found: ID does not exist" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.127839 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.128656 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hms9w" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="registry-server" containerID="cri-o://0e898f4a8b37b27ba32d97f392ed453e210649460464fdac481669e4d2c53317" gracePeriod=2 Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.412855 4780 generic.go:334] "Generic (PLEG): container finished" podID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerID="0e898f4a8b37b27ba32d97f392ed453e210649460464fdac481669e4d2c53317" exitCode=0 Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.412927 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerDied","Data":"0e898f4a8b37b27ba32d97f392ed453e210649460464fdac481669e4d2c53317"} Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.484482 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.596943 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fbvb\" (UniqueName: \"kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb\") pod \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.597435 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content\") pod \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.597464 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities\") pod \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\" (UID: \"5ccd51ee-2519-477d-bb0f-182d3837fa0f\") " Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.598702 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities" (OuterVolumeSpecName: "utilities") pod "5ccd51ee-2519-477d-bb0f-182d3837fa0f" (UID: "5ccd51ee-2519-477d-bb0f-182d3837fa0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.618510 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb" (OuterVolumeSpecName: "kube-api-access-6fbvb") pod "5ccd51ee-2519-477d-bb0f-182d3837fa0f" (UID: "5ccd51ee-2519-477d-bb0f-182d3837fa0f"). InnerVolumeSpecName "kube-api-access-6fbvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.692665 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ccd51ee-2519-477d-bb0f-182d3837fa0f" (UID: "5ccd51ee-2519-477d-bb0f-182d3837fa0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.699128 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.699161 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ccd51ee-2519-477d-bb0f-182d3837fa0f-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.699173 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fbvb\" (UniqueName: \"kubernetes.io/projected/5ccd51ee-2519-477d-bb0f-182d3837fa0f-kube-api-access-6fbvb\") on node \"crc\" DevicePath \"\"" Sep 29 18:46:36 crc kubenswrapper[4780]: I0929 18:46:36.760929 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" path="/var/lib/kubelet/pods/8d660231-ddc9-400d-9a41-2395dfcbc3d7/volumes" Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.423354 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hms9w" event={"ID":"5ccd51ee-2519-477d-bb0f-182d3837fa0f","Type":"ContainerDied","Data":"226a74f85799c1ad2e0fd378fb8325eda45dac8505ed974c3bf98512ad8f0ac2"} Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.424895 4780 scope.go:117] "RemoveContainer" containerID="0e898f4a8b37b27ba32d97f392ed453e210649460464fdac481669e4d2c53317" Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.423763 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hms9w" Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.453603 4780 scope.go:117] "RemoveContainer" containerID="b03195390c48c298ed73bd149e953fb45fa989483f24c3b5f8c07a54d919b1bd" Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.470108 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.471673 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hms9w"] Sep 29 18:46:37 crc kubenswrapper[4780]: I0929 18:46:37.473231 4780 scope.go:117] "RemoveContainer" containerID="5ab1cab88eebe4b603131db030dd3586441c8a4a22f8e8bc0be227f9a37410f6" Sep 29 18:46:38 crc kubenswrapper[4780]: I0929 18:46:38.759349 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" path="/var/lib/kubelet/pods/5ccd51ee-2519-477d-bb0f-182d3837fa0f/volumes" Sep 29 18:47:01 crc kubenswrapper[4780]: I0929 18:47:01.497221 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.223860 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.224241 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.224311 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.225160 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.225262 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5" gracePeriod=600 Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.594716 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5" exitCode=0 Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.594944 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5"} Sep 29 18:47:03 crc kubenswrapper[4780]: I0929 18:47:03.595250 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b"} Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.529311 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerName="oauth-openshift" containerID="cri-o://52e4b79e248b284e7fa1d037412a9740a36f93f45f76d62188a157ae938368a1" gracePeriod=15 Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.785342 4780 generic.go:334] "Generic (PLEG): container finished" podID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerID="52e4b79e248b284e7fa1d037412a9740a36f93f45f76d62188a157ae938368a1" exitCode=0 Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.785609 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" event={"ID":"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019","Type":"ContainerDied","Data":"52e4b79e248b284e7fa1d037412a9740a36f93f45f76d62188a157ae938368a1"} Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.878530 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914627 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-xgxll"] Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914852 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d120ffc1-61fe-49e0-9775-e24e4356f900" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914865 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d120ffc1-61fe-49e0-9775-e24e4356f900" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914874 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914880 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914888 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914894 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914906 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914913 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914923 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914929 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914937 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914943 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914954 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914962 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914971 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914977 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914985 4780 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.914991 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.914998 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915004 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.915012 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915018 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.915024 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915030 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="extract-content" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.915036 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerName="oauth-openshift" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915057 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerName="oauth-openshift" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.915065 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915070 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="extract-utilities" Sep 29 18:47:26 crc kubenswrapper[4780]: E0929 18:47:26.915078 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46b3f97-035d-4966-b616-30e6f1f20d7a" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915084 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46b3f97-035d-4966-b616-30e6f1f20d7a" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915185 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d660231-ddc9-400d-9a41-2395dfcbc3d7" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915201 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e054aba-94dc-4945-b4cd-0b7d01db39c4" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915208 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" containerName="oauth-openshift" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915215 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46b3f97-035d-4966-b616-30e6f1f20d7a" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915223 4780 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c21481e6-9cfa-46bf-a667-52b4a9b336d1" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915229 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d120ffc1-61fe-49e0-9775-e24e4356f900" containerName="pruner" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915236 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ccd51ee-2519-477d-bb0f-182d3837fa0f" containerName="registry-server" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.915622 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.933131 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-xgxll"] Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.958448 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-policies\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.958549 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-dir\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:26 crc kubenswrapper[4780]: I0929 18:47:26.958591 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059421 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059483 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059521 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059593 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059633 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059697 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059751 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059724 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059776 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.059968 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060136 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060236 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060317 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060375 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hscc\" (UniqueName: \"kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060415 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca\") pod \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\" (UID: \"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019\") " Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060825 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-policies\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.060916 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061037 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061152 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061194 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-dir\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061268 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-dir\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061313 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061482 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061544 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061582 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061697 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hmj\" (UniqueName: \"kubernetes.io/projected/bd37356f-873a-41bf-8f84-17c0fd2c1387-kube-api-access-f2hmj\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061761 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " 
pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.061840 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062096 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062136 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062175 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062334 4780 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062355 4780 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062415 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.062721 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-audit-policies\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.063150 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.063421 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.068204 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-session\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.068351 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.069017 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.070604 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.070953 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc" (OuterVolumeSpecName: "kube-api-access-4hscc") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "kube-api-access-4hscc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.071133 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.071552 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.071900 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.073752 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.074542 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" (UID: "ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.163850 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.163954 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.163982 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164005 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164116 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hmj\" (UniqueName: \"kubernetes.io/projected/bd37356f-873a-41bf-8f84-17c0fd2c1387-kube-api-access-f2hmj\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164145 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164175 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164191 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " 
pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164212 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164232 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164261 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164307 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164322 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164335 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164346 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hscc\" (UniqueName: \"kubernetes.io/projected/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-kube-api-access-4hscc\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164358 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164369 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164381 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc 
kubenswrapper[4780]: I0929 18:47:27.164394 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164407 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164421 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164432 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.164444 4780 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.166401 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-service-ca\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.167361 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.167383 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.167894 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.169038 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.169504 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-error\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.169723 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.171028 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-user-template-login\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.171745 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-router-certs\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.172297 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd37356f-873a-41bf-8f84-17c0fd2c1387-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.186470 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hmj\" (UniqueName: \"kubernetes.io/projected/bd37356f-873a-41bf-8f84-17c0fd2c1387-kube-api-access-f2hmj\") pod \"oauth-openshift-55889b984c-xgxll\" (UID: \"bd37356f-873a-41bf-8f84-17c0fd2c1387\") " pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.239580 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.458333 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55889b984c-xgxll"] Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.796205 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.796391 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-n6qtf" event={"ID":"ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019","Type":"ContainerDied","Data":"e6af6f211b2f7708ff750bba4fc6867a0d4b97da2d6b4ea2b038e8b086fa176c"} Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.796817 4780 scope.go:117] "RemoveContainer" containerID="52e4b79e248b284e7fa1d037412a9740a36f93f45f76d62188a157ae938368a1" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.799867 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" event={"ID":"bd37356f-873a-41bf-8f84-17c0fd2c1387","Type":"ContainerStarted","Data":"8290d87cb79c95e353453d20a570ad70af32fbda91aab6f4d6e1a9dc7f757008"} Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.799921 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" event={"ID":"bd37356f-873a-41bf-8f84-17c0fd2c1387","Type":"ContainerStarted","Data":"4413a1531b03b9f877b2c52eb2fa57dd2698259e13c35092957827259a36e33b"} Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.800129 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.829978 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" podStartSLOduration=26.829946314 podStartE2EDuration="26.829946314s" podCreationTimestamp="2025-09-29 18:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:47:27.828628463 +0000 UTC m=+247.776926517" watchObservedRunningTime="2025-09-29 18:47:27.829946314 +0000 UTC m=+247.778244378" Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.854674 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:47:27 crc kubenswrapper[4780]: I0929 18:47:27.861345 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-n6qtf"] Sep 29 18:47:28 crc kubenswrapper[4780]: I0929 18:47:28.360742 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55889b984c-xgxll" Sep 29 18:47:28 crc kubenswrapper[4780]: I0929 18:47:28.762287 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019" path="/var/lib/kubelet/pods/ac14e5ca-d14b-4f77-9cd0-3aba2a3f3019/volumes" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.299387 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.301217 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qtrcx" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="registry-server" containerID="cri-o://1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7" gracePeriod=30 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.315058 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-f2g2b"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.315387 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f2g2b" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="registry-server" containerID="cri-o://7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" gracePeriod=30 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.325843 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.326152 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" containerID="cri-o://505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9" gracePeriod=30 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.337568 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.339810 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hqdk7" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="registry-server" containerID="cri-o://f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4" gracePeriod=30 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.351568 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.351936 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8bcsn" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="registry-server" containerID="cri-o://c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a" gracePeriod=30 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.355124 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ld5pv"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.355916 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.377317 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ld5pv"] Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.441266 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.441353 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.441414 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7mb\" (UniqueName: \"kubernetes.io/projected/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-kube-api-access-pl7mb\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.545597 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.545700 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.546026 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7mb\" (UniqueName: \"kubernetes.io/projected/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-kube-api-access-pl7mb\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.549958 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.557230 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.571198 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7mb\" (UniqueName: \"kubernetes.io/projected/b6cfd83e-4c6c-4e46-8981-81d25b08d81e-kube-api-access-pl7mb\") pod \"marketplace-operator-79b997595-ld5pv\" (UID: \"b6cfd83e-4c6c-4e46-8981-81d25b08d81e\") " pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.658023 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 is running failed: container process not found" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.658474 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 is running failed: container process not found" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.659256 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 is running failed: container process not found" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" cmd=["grpc_health_probe","-addr=:50051"] Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.659293 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-f2g2b" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="registry-server" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.714127 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.745811 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtrcx" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.812870 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.813540 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f2g2b" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.853633 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content\") pod \"4b71dbf4-1e39-4222-bfb1-ccec82699848\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.853911 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities\") pod \"4b71dbf4-1e39-4222-bfb1-ccec82699848\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.853936 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mhk\" (UniqueName: \"kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk\") pod \"4b71dbf4-1e39-4222-bfb1-ccec82699848\" (UID: \"4b71dbf4-1e39-4222-bfb1-ccec82699848\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.856435 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities" (OuterVolumeSpecName: "utilities") pod "4b71dbf4-1e39-4222-bfb1-ccec82699848" (UID: "4b71dbf4-1e39-4222-bfb1-ccec82699848"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.859489 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk" (OuterVolumeSpecName: "kube-api-access-v9mhk") pod "4b71dbf4-1e39-4222-bfb1-ccec82699848" (UID: "4b71dbf4-1e39-4222-bfb1-ccec82699848"). InnerVolumeSpecName "kube-api-access-v9mhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.861102 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.893021 4780 generic.go:334] "Generic (PLEG): container finished" podID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerID="1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7" exitCode=0 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.893153 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerDied","Data":"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.893193 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtrcx" event={"ID":"4b71dbf4-1e39-4222-bfb1-ccec82699848","Type":"ContainerDied","Data":"1ccc6c55a0edad2d0a7e7e196bec54edb2ed767d90490d69d85f2d29a44a6a60"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.893213 4780 scope.go:117] "RemoveContainer" containerID="1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.893381 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qtrcx" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.896742 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.901932 4780 generic.go:334] "Generic (PLEG): container finished" podID="3fcaee64-db78-46be-a54e-8412e4394681" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" exitCode=0 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.902004 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerDied","Data":"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.902036 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2g2b" event={"ID":"3fcaee64-db78-46be-a54e-8412e4394681","Type":"ContainerDied","Data":"2fb79a636e33ad6294493a19abf9dd59665d338b5b2d09eba83ba423974fe894"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.902119 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2g2b" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.903754 4780 generic.go:334] "Generic (PLEG): container finished" podID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerID="505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9" exitCode=0 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.903797 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" event={"ID":"78d8c0c3-f516-48ef-8279-a2e9e0c04835","Type":"ContainerDied","Data":"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.903813 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" event={"ID":"78d8c0c3-f516-48ef-8279-a2e9e0c04835","Type":"ContainerDied","Data":"bc23b42711602542d1413450ef02684a04b26267e48acd638f0ce028f4cb3772"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.903853 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-slln5" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.914444 4780 generic.go:334] "Generic (PLEG): container finished" podID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerID="c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a" exitCode=0 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.914584 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerDied","Data":"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.914618 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bcsn" event={"ID":"725c40c0-e9f5-4caf-9aac-812bf777bf8b","Type":"ContainerDied","Data":"a4016787c14e0f90a5c1b349dfba7f1e46b16fc1599a3abca50417612ea068b8"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.914704 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bcsn" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.918687 4780 scope.go:117] "RemoveContainer" containerID="eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.930308 4780 generic.go:334] "Generic (PLEG): container finished" podID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerID="f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4" exitCode=0 Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.930371 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerDied","Data":"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.930414 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqdk7" event={"ID":"9352c838-865a-4ef4-ae6f-e6b49ef46fa2","Type":"ContainerDied","Data":"9885863f9a370fca9b5af6395707d7152dd1822b6d6a3026fff77ff450a3e3dc"} Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.930519 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqdk7" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.952363 4780 scope.go:117] "RemoveContainer" containerID="cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.954931 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities\") pod \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.954963 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities\") pod \"3fcaee64-db78-46be-a54e-8412e4394681\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955091 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content\") pod \"3fcaee64-db78-46be-a54e-8412e4394681\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955121 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhmj\" (UniqueName: \"kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj\") pod \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955152 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhdfq\" (UniqueName: \"kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq\") pod \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955184 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content\") pod \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\" (UID: \"725c40c0-e9f5-4caf-9aac-812bf777bf8b\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955219 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics\") pod \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955261 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca\") pod \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\" (UID: \"78d8c0c3-f516-48ef-8279-a2e9e0c04835\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955350 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdqn\" (UniqueName: \"kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn\") pod \"3fcaee64-db78-46be-a54e-8412e4394681\" (UID: \"3fcaee64-db78-46be-a54e-8412e4394681\") " Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955618 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.955639 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mhk\" (UniqueName: \"kubernetes.io/projected/4b71dbf4-1e39-4222-bfb1-ccec82699848-kube-api-access-v9mhk\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.956594 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "78d8c0c3-f516-48ef-8279-a2e9e0c04835" (UID: "78d8c0c3-f516-48ef-8279-a2e9e0c04835"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.960733 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq" (OuterVolumeSpecName: "kube-api-access-jhdfq") pod "78d8c0c3-f516-48ef-8279-a2e9e0c04835" (UID: "78d8c0c3-f516-48ef-8279-a2e9e0c04835"). InnerVolumeSpecName "kube-api-access-jhdfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.961940 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn" (OuterVolumeSpecName: "kube-api-access-vqdqn") pod "3fcaee64-db78-46be-a54e-8412e4394681" (UID: "3fcaee64-db78-46be-a54e-8412e4394681"). InnerVolumeSpecName "kube-api-access-vqdqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.963261 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "78d8c0c3-f516-48ef-8279-a2e9e0c04835" (UID: "78d8c0c3-f516-48ef-8279-a2e9e0c04835"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.964302 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities" (OuterVolumeSpecName: "utilities") pod "3fcaee64-db78-46be-a54e-8412e4394681" (UID: "3fcaee64-db78-46be-a54e-8412e4394681"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.979881 4780 scope.go:117] "RemoveContainer" containerID="1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7" Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.980439 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7\": container with ID starting with 1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7 not found: ID does not exist" containerID="1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.980496 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7"} err="failed to get container status \"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7\": rpc error: code = NotFound desc = could not find container \"1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7\": container with ID starting with 1597124aa58f6c583be3addd6251d37bea71c6d382bec6ea20cf259fffde58c7 not found: ID does not exist" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.980535 4780 scope.go:117] "RemoveContainer" containerID="eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0" Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.980868 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0\": container with ID starting with eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0 not found: ID does not exist" containerID="eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.980904 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0"} err="failed to get container status \"eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0\": rpc error: code = NotFound desc = could not find container \"eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0\": container with ID starting with eb63e56b16dd1353f2699be1ebe522594b3d11f7032cfe82772d34e4b86a33c0 not found: ID does not exist" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.980922 4780 scope.go:117] "RemoveContainer" 
containerID="cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.981006 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b71dbf4-1e39-4222-bfb1-ccec82699848" (UID: "4b71dbf4-1e39-4222-bfb1-ccec82699848"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: E0929 18:47:39.981242 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e\": container with ID starting with cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e not found: ID does not exist" containerID="cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.981286 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e"} err="failed to get container status \"cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e\": rpc error: code = NotFound desc = could not find container \"cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e\": container with ID starting with cc194d21a26e07c3bc1ec0727cf2e5b8935f9d7ba9f81c94b13d92b971c5d73e not found: ID does not exist" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.981314 4780 scope.go:117] "RemoveContainer" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.983357 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj" (OuterVolumeSpecName: "kube-api-access-nmhmj") pod "725c40c0-e9f5-4caf-9aac-812bf777bf8b" (UID: "725c40c0-e9f5-4caf-9aac-812bf777bf8b"). InnerVolumeSpecName "kube-api-access-nmhmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.985841 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities" (OuterVolumeSpecName: "utilities") pod "725c40c0-e9f5-4caf-9aac-812bf777bf8b" (UID: "725c40c0-e9f5-4caf-9aac-812bf777bf8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:39 crc kubenswrapper[4780]: I0929 18:47:39.998887 4780 scope.go:117] "RemoveContainer" containerID="5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.015071 4780 scope.go:117] "RemoveContainer" containerID="bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.017879 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fcaee64-db78-46be-a54e-8412e4394681" (UID: "3fcaee64-db78-46be-a54e-8412e4394681"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.043819 4780 scope.go:117] "RemoveContainer" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.044569 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9\": container with ID starting with 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 not found: ID does not exist" containerID="7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.044631 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9"} err="failed to get container status \"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9\": rpc error: code = NotFound desc = could not find container \"7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9\": container with ID starting with 7bb7f18107dc4485bea89bfe108f12a12d7f37ff14297e69153736b97025c8a9 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.044665 4780 scope.go:117] "RemoveContainer" containerID="5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.045028 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026\": container with ID starting with 5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026 not found: ID does not exist" containerID="5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.045114 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026"} err="failed to get container status \"5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026\": rpc error: code = NotFound desc = could not find container \"5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026\": container with ID starting with 5bfec63d06c6548db43c0dc725f170d7ae63e53bd74743b47f948dee13f33026 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.045153 4780 scope.go:117] "RemoveContainer" containerID="bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.045507 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6\": container with ID starting with bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6 not found: ID does not exist" containerID="bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.045540 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6"} err="failed to get container status \"bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6\": rpc error: code = NotFound desc = could not 
find container \"bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6\": container with ID starting with bf25c1aa98cc309300b10f4371569f4383e3406138e3319acc36ad8e63982bf6 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.045557 4780 scope.go:117] "RemoveContainer" containerID="505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.056959 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv7cp\" (UniqueName: \"kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp\") pod \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057104 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content\") pod \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057193 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities\") pod \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\" (UID: \"9352c838-865a-4ef4-ae6f-e6b49ef46fa2\") " Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057532 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdqn\" (UniqueName: \"kubernetes.io/projected/3fcaee64-db78-46be-a54e-8412e4394681-kube-api-access-vqdqn\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057553 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057567 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057580 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b71dbf4-1e39-4222-bfb1-ccec82699848-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057592 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fcaee64-db78-46be-a54e-8412e4394681-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057605 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhmj\" (UniqueName: \"kubernetes.io/projected/725c40c0-e9f5-4caf-9aac-812bf777bf8b-kube-api-access-nmhmj\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057615 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhdfq\" (UniqueName: \"kubernetes.io/projected/78d8c0c3-f516-48ef-8279-a2e9e0c04835-kube-api-access-jhdfq\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057626 4780 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.057639 4780 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d8c0c3-f516-48ef-8279-a2e9e0c04835-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.058327 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities" (OuterVolumeSpecName: "utilities") pod "9352c838-865a-4ef4-ae6f-e6b49ef46fa2" (UID: "9352c838-865a-4ef4-ae6f-e6b49ef46fa2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.059457 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "725c40c0-e9f5-4caf-9aac-812bf777bf8b" (UID: "725c40c0-e9f5-4caf-9aac-812bf777bf8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.064187 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp" (OuterVolumeSpecName: "kube-api-access-hv7cp") pod "9352c838-865a-4ef4-ae6f-e6b49ef46fa2" (UID: "9352c838-865a-4ef4-ae6f-e6b49ef46fa2"). InnerVolumeSpecName "kube-api-access-hv7cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.064241 4780 scope.go:117] "RemoveContainer" containerID="505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.066217 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9\": container with ID starting with 505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9 not found: ID does not exist" containerID="505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.066270 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9"} err="failed to get container status \"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9\": rpc error: code = NotFound desc = could not find container \"505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9\": container with ID starting with 505b98725cf27f80b975afb25c60e24f9e948ffe20cdeb78d058f44e2b1c9ee9 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.066315 4780 scope.go:117] "RemoveContainer" containerID="c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.073011 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9352c838-865a-4ef4-ae6f-e6b49ef46fa2" (UID: "9352c838-865a-4ef4-ae6f-e6b49ef46fa2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.083365 4780 scope.go:117] "RemoveContainer" containerID="d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.103704 4780 scope.go:117] "RemoveContainer" containerID="911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.119750 4780 scope.go:117] "RemoveContainer" containerID="c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.120646 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a\": container with ID starting with c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a not found: ID does not exist" containerID="c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.120713 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a"} err="failed to get container status \"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a\": rpc error: code = NotFound desc = could not find container \"c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a\": container with ID starting with c346a28edbcd83a0547390b78ed63347c13b67705561efa18bf185674c753d7a not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.120766 4780 scope.go:117] "RemoveContainer" containerID="d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.121309 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda\": container with ID starting with d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda not found: ID does not exist" containerID="d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.121459 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda"} err="failed to get container status \"d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda\": rpc error: code = NotFound desc = could not find container \"d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda\": container with ID starting with d100730c6a277705764992479801c3b94533d3ce6a75baa54017665ea002dfda not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.121585 4780 scope.go:117] "RemoveContainer" containerID="911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.122126 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d\": container with ID starting with 911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d not found: ID does not exist" containerID="911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d" Sep 29 18:47:40 crc 
kubenswrapper[4780]: I0929 18:47:40.122166 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d"} err="failed to get container status \"911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d\": rpc error: code = NotFound desc = could not find container \"911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d\": container with ID starting with 911b7841183a1a9c2e0af0d3fae7889744a3b58c48feede0635a6d5e4fc7867d not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.122188 4780 scope.go:117] "RemoveContainer" containerID="f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.138983 4780 scope.go:117] "RemoveContainer" containerID="e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.158628 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv7cp\" (UniqueName: \"kubernetes.io/projected/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-kube-api-access-hv7cp\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.158658 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.158668 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9352c838-865a-4ef4-ae6f-e6b49ef46fa2-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.158884 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/725c40c0-e9f5-4caf-9aac-812bf777bf8b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.178170 4780 scope.go:117] "RemoveContainer" containerID="68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.195607 4780 scope.go:117] "RemoveContainer" containerID="f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.196801 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4\": container with ID starting with f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4 not found: ID does not exist" containerID="f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.196852 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4"} err="failed to get container status \"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4\": rpc error: code = NotFound desc = could not find container \"f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4\": container with ID starting with f2a479ede5854f8c347aab6da3b41ac3b62622e6adb84073194e7301529ec3c4 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.196889 4780 scope.go:117] "RemoveContainer" 
containerID="e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.197326 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6\": container with ID starting with e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6 not found: ID does not exist" containerID="e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.197376 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6"} err="failed to get container status \"e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6\": rpc error: code = NotFound desc = could not find container \"e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6\": container with ID starting with e5ec5f26989a9453d859f7fe07c22b75b81ecef109848f94853a776bc75a37b6 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.197452 4780 scope.go:117] "RemoveContainer" containerID="68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339" Sep 29 18:47:40 crc kubenswrapper[4780]: E0929 18:47:40.197902 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339\": container with ID starting with 68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339 not found: ID does not exist" containerID="68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.197924 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339"} err="failed to get container status \"68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339\": rpc error: code = NotFound desc = could not find container \"68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339\": container with ID starting with 68e93c11f91788e9d9e68ca00f1b1554bee47e02127cc3c69a534708abf2d339 not found: ID does not exist" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.202210 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ld5pv"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.238341 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.260253 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qtrcx"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.267599 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.272842 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-slln5"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.278127 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f2g2b"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.285127 4780 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-f2g2b"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.299722 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.303486 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8bcsn"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.307260 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.309440 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqdk7"] Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.762364 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcaee64-db78-46be-a54e-8412e4394681" path="/var/lib/kubelet/pods/3fcaee64-db78-46be-a54e-8412e4394681/volumes" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.763149 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" path="/var/lib/kubelet/pods/4b71dbf4-1e39-4222-bfb1-ccec82699848/volumes" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.763710 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" path="/var/lib/kubelet/pods/725c40c0-e9f5-4caf-9aac-812bf777bf8b/volumes" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.764339 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" path="/var/lib/kubelet/pods/78d8c0c3-f516-48ef-8279-a2e9e0c04835/volumes" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.764766 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" path="/var/lib/kubelet/pods/9352c838-865a-4ef4-ae6f-e6b49ef46fa2/volumes" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.940234 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" event={"ID":"b6cfd83e-4c6c-4e46-8981-81d25b08d81e","Type":"ContainerStarted","Data":"3c7454d0a39f5d945ed92d4c903d945a0e5af8b3950e4ccb133729614eb5adb6"} Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.940825 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" event={"ID":"b6cfd83e-4c6c-4e46-8981-81d25b08d81e","Type":"ContainerStarted","Data":"816a26f29f018ed65ddfa7239d95f57b298010d50e948bceee9f78849c446266"} Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.940852 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.953318 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" Sep 29 18:47:40 crc kubenswrapper[4780]: I0929 18:47:40.959713 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ld5pv" podStartSLOduration=1.959675584 podStartE2EDuration="1.959675584s" podCreationTimestamp="2025-09-29 18:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:47:40.956686802 +0000 
UTC m=+260.904984846" watchObservedRunningTime="2025-09-29 18:47:40.959675584 +0000 UTC m=+260.907973628" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.317786 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-68x8k"] Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318020 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318062 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318074 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318081 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318090 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318097 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318106 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318112 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318119 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318125 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318133 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318139 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318150 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318156 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="extract-utilities" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318166 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318172 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318182 4780 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318187 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318195 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318214 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318222 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318228 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318235 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318241 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: E0929 18:47:41.318248 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318256 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="extract-content" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318346 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcaee64-db78-46be-a54e-8412e4394681" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318358 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d8c0c3-f516-48ef-8279-a2e9e0c04835" containerName="marketplace-operator" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318369 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="725c40c0-e9f5-4caf-9aac-812bf777bf8b" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318380 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="9352c838-865a-4ef4-ae6f-e6b49ef46fa2" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.318388 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b71dbf4-1e39-4222-bfb1-ccec82699848" containerName="registry-server" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.319206 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.321626 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.332465 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68x8k"] Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.492641 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-catalog-content\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.492716 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2bd\" (UniqueName: \"kubernetes.io/projected/b0dc84e5-abe0-4e53-813c-0363cf9de12f-kube-api-access-xt2bd\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.492761 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-utilities\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.594001 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2bd\" (UniqueName: \"kubernetes.io/projected/b0dc84e5-abe0-4e53-813c-0363cf9de12f-kube-api-access-xt2bd\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.594121 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-utilities\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.594177 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-catalog-content\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.594783 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-catalog-content\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.594905 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dc84e5-abe0-4e53-813c-0363cf9de12f-utilities\") pod \"redhat-marketplace-68x8k\" (UID: 
\"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.621570 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2bd\" (UniqueName: \"kubernetes.io/projected/b0dc84e5-abe0-4e53-813c-0363cf9de12f-kube-api-access-xt2bd\") pod \"redhat-marketplace-68x8k\" (UID: \"b0dc84e5-abe0-4e53-813c-0363cf9de12f\") " pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.637884 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.873008 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68x8k"] Sep 29 18:47:41 crc kubenswrapper[4780]: W0929 18:47:41.881238 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0dc84e5_abe0_4e53_813c_0363cf9de12f.slice/crio-fc174069de589f3fe57b3c685577ddd4aa880936a1dce9db3b2703c5abf782c6 WatchSource:0}: Error finding container fc174069de589f3fe57b3c685577ddd4aa880936a1dce9db3b2703c5abf782c6: Status 404 returned error can't find the container with id fc174069de589f3fe57b3c685577ddd4aa880936a1dce9db3b2703c5abf782c6 Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.915070 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xzf5k"] Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.922816 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.937559 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.937793 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xzf5k"] Sep 29 18:47:41 crc kubenswrapper[4780]: I0929 18:47:41.966174 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68x8k" event={"ID":"b0dc84e5-abe0-4e53-813c-0363cf9de12f","Type":"ContainerStarted","Data":"fc174069de589f3fe57b3c685577ddd4aa880936a1dce9db3b2703c5abf782c6"} Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.101900 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-catalog-content\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.101959 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjmv8\" (UniqueName: \"kubernetes.io/projected/5ff6fc42-ccc4-4d32-8699-0ad29962b340-kube-api-access-fjmv8\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.102001 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-utilities\") pod 
\"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.207211 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-catalog-content\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.207266 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjmv8\" (UniqueName: \"kubernetes.io/projected/5ff6fc42-ccc4-4d32-8699-0ad29962b340-kube-api-access-fjmv8\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.207292 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-utilities\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.207774 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-utilities\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.207967 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff6fc42-ccc4-4d32-8699-0ad29962b340-catalog-content\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.235482 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjmv8\" (UniqueName: \"kubernetes.io/projected/5ff6fc42-ccc4-4d32-8699-0ad29962b340-kube-api-access-fjmv8\") pod \"redhat-operators-xzf5k\" (UID: \"5ff6fc42-ccc4-4d32-8699-0ad29962b340\") " pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.281954 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.474761 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xzf5k"] Sep 29 18:47:42 crc kubenswrapper[4780]: W0929 18:47:42.490229 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff6fc42_ccc4_4d32_8699_0ad29962b340.slice/crio-29b97ce7a6fa82a3e5e56ff6ccd13508857bf3b52538b44d236c9d1d427d292a WatchSource:0}: Error finding container 29b97ce7a6fa82a3e5e56ff6ccd13508857bf3b52538b44d236c9d1d427d292a: Status 404 returned error can't find the container with id 29b97ce7a6fa82a3e5e56ff6ccd13508857bf3b52538b44d236c9d1d427d292a Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.978380 4780 generic.go:334] "Generic (PLEG): container finished" podID="5ff6fc42-ccc4-4d32-8699-0ad29962b340" containerID="3fbde1d47c8a9c3a4f096f8b1315b478ef8cb57c592ca2f7e71f2e5ea403b137" exitCode=0 Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.978460 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xzf5k" event={"ID":"5ff6fc42-ccc4-4d32-8699-0ad29962b340","Type":"ContainerDied","Data":"3fbde1d47c8a9c3a4f096f8b1315b478ef8cb57c592ca2f7e71f2e5ea403b137"} Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.978805 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xzf5k" event={"ID":"5ff6fc42-ccc4-4d32-8699-0ad29962b340","Type":"ContainerStarted","Data":"29b97ce7a6fa82a3e5e56ff6ccd13508857bf3b52538b44d236c9d1d427d292a"} Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.983230 4780 generic.go:334] "Generic (PLEG): container finished" podID="b0dc84e5-abe0-4e53-813c-0363cf9de12f" containerID="d3bed422389446657fe1a22ad238a3e226a2965f8e017b6ec52f5271fbcc64eb" exitCode=0 Sep 29 18:47:42 crc kubenswrapper[4780]: I0929 18:47:42.983306 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68x8k" event={"ID":"b0dc84e5-abe0-4e53-813c-0363cf9de12f","Type":"ContainerDied","Data":"d3bed422389446657fe1a22ad238a3e226a2965f8e017b6ec52f5271fbcc64eb"} Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.739797 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.743387 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.745854 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.752905 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.837646 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcvrp\" (UniqueName: \"kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.837706 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.837735 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.939381 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcvrp\" (UniqueName: \"kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.939451 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.939475 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.939978 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.940277 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities\") pod \"certified-operators-sfstc\" (UID: 
\"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.965662 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcvrp\" (UniqueName: \"kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp\") pod \"certified-operators-sfstc\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:43 crc kubenswrapper[4780]: I0929 18:47:43.989844 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68x8k" event={"ID":"b0dc84e5-abe0-4e53-813c-0363cf9de12f","Type":"ContainerStarted","Data":"05fe320ef6a8fb2f51bfa81fe1bdf4e7c32e79dd8cda4facd2e68e1c0dd20faa"} Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.067089 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.320716 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bd54"] Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.322346 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.326898 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.335310 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bd54"] Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.447371 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-catalog-content\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.447455 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz4r\" (UniqueName: \"kubernetes.io/projected/5059373a-528f-485b-afbe-2bd945289b0b-kube-api-access-txz4r\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.447865 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-utilities\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.510771 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.550204 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-utilities\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 
29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.550534 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-catalog-content\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.550583 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz4r\" (UniqueName: \"kubernetes.io/projected/5059373a-528f-485b-afbe-2bd945289b0b-kube-api-access-txz4r\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.550998 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-utilities\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.551281 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5059373a-528f-485b-afbe-2bd945289b0b-catalog-content\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.579191 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txz4r\" (UniqueName: \"kubernetes.io/projected/5059373a-528f-485b-afbe-2bd945289b0b-kube-api-access-txz4r\") pod \"community-operators-6bd54\" (UID: \"5059373a-528f-485b-afbe-2bd945289b0b\") " pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.647341 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.895322 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bd54"] Sep 29 18:47:44 crc kubenswrapper[4780]: W0929 18:47:44.905561 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5059373a_528f_485b_afbe_2bd945289b0b.slice/crio-25b59a1e81f21142f5936e6d2b631f1f891535551b22c92296c54189820791d2 WatchSource:0}: Error finding container 25b59a1e81f21142f5936e6d2b631f1f891535551b22c92296c54189820791d2: Status 404 returned error can't find the container with id 25b59a1e81f21142f5936e6d2b631f1f891535551b22c92296c54189820791d2 Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.997981 4780 generic.go:334] "Generic (PLEG): container finished" podID="b0dc84e5-abe0-4e53-813c-0363cf9de12f" containerID="05fe320ef6a8fb2f51bfa81fe1bdf4e7c32e79dd8cda4facd2e68e1c0dd20faa" exitCode=0 Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.998105 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68x8k" event={"ID":"b0dc84e5-abe0-4e53-813c-0363cf9de12f","Type":"ContainerDied","Data":"05fe320ef6a8fb2f51bfa81fe1bdf4e7c32e79dd8cda4facd2e68e1c0dd20faa"} Sep 29 18:47:44 crc kubenswrapper[4780]: I0929 18:47:44.998139 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68x8k" event={"ID":"b0dc84e5-abe0-4e53-813c-0363cf9de12f","Type":"ContainerStarted","Data":"cadd3b20537099e1ed8c2c49444d4edb25213568a0b1bd066935d6830a281bb5"} Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.002557 4780 generic.go:334] "Generic (PLEG): container finished" podID="5ff6fc42-ccc4-4d32-8699-0ad29962b340" containerID="5479f9ea91a8bf9e1d9a074b13b32d5b562f60645c9bbfa813afab9e7af82ae9" exitCode=0 Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.002638 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xzf5k" event={"ID":"5ff6fc42-ccc4-4d32-8699-0ad29962b340","Type":"ContainerDied","Data":"5479f9ea91a8bf9e1d9a074b13b32d5b562f60645c9bbfa813afab9e7af82ae9"} Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.005016 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bd54" event={"ID":"5059373a-528f-485b-afbe-2bd945289b0b","Type":"ContainerStarted","Data":"25b59a1e81f21142f5936e6d2b631f1f891535551b22c92296c54189820791d2"} Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.009665 4780 generic.go:334] "Generic (PLEG): container finished" podID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerID="402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e" exitCode=0 Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.009739 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerDied","Data":"402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e"} Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.010285 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerStarted","Data":"8a1104bfd0c2d9ab5d87200dfc5152e8eda157699ce5d06c2451915e728f0746"} Sep 29 18:47:45 crc kubenswrapper[4780]: I0929 18:47:45.026866 
4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-68x8k" podStartSLOduration=2.620856929 podStartE2EDuration="4.026838668s" podCreationTimestamp="2025-09-29 18:47:41 +0000 UTC" firstStartedPulling="2025-09-29 18:47:42.985677762 +0000 UTC m=+262.933975806" lastFinishedPulling="2025-09-29 18:47:44.391659501 +0000 UTC m=+264.339957545" observedRunningTime="2025-09-29 18:47:45.025833447 +0000 UTC m=+264.974131491" watchObservedRunningTime="2025-09-29 18:47:45.026838668 +0000 UTC m=+264.975136712" Sep 29 18:47:46 crc kubenswrapper[4780]: I0929 18:47:46.021104 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xzf5k" event={"ID":"5ff6fc42-ccc4-4d32-8699-0ad29962b340","Type":"ContainerStarted","Data":"eaf22af2764eaa1f7ebecc9637ed8c545a6d03c8b225d6f33469eccd8eb679b3"} Sep 29 18:47:46 crc kubenswrapper[4780]: I0929 18:47:46.022888 4780 generic.go:334] "Generic (PLEG): container finished" podID="5059373a-528f-485b-afbe-2bd945289b0b" containerID="6f3710bc712980860edaed89d2ecdd6ac264d5b90efa82378b47b241012aec09" exitCode=0 Sep 29 18:47:46 crc kubenswrapper[4780]: I0929 18:47:46.023419 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bd54" event={"ID":"5059373a-528f-485b-afbe-2bd945289b0b","Type":"ContainerDied","Data":"6f3710bc712980860edaed89d2ecdd6ac264d5b90efa82378b47b241012aec09"} Sep 29 18:47:46 crc kubenswrapper[4780]: I0929 18:47:46.047026 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xzf5k" podStartSLOduration=2.663223666 podStartE2EDuration="5.047007104s" podCreationTimestamp="2025-09-29 18:47:41 +0000 UTC" firstStartedPulling="2025-09-29 18:47:42.982568326 +0000 UTC m=+262.930866410" lastFinishedPulling="2025-09-29 18:47:45.366351804 +0000 UTC m=+265.314649848" observedRunningTime="2025-09-29 18:47:46.044244489 +0000 UTC m=+265.992542543" watchObservedRunningTime="2025-09-29 18:47:46.047007104 +0000 UTC m=+265.995305148" Sep 29 18:47:47 crc kubenswrapper[4780]: I0929 18:47:47.032203 4780 generic.go:334] "Generic (PLEG): container finished" podID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerID="bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352" exitCode=0 Sep 29 18:47:47 crc kubenswrapper[4780]: I0929 18:47:47.032269 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerDied","Data":"bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352"} Sep 29 18:47:49 crc kubenswrapper[4780]: I0929 18:47:49.047219 4780 generic.go:334] "Generic (PLEG): container finished" podID="5059373a-528f-485b-afbe-2bd945289b0b" containerID="922356983e4a6c7c363df810df13650cd90783b9ef9b4e69fd9760e3d5862604" exitCode=0 Sep 29 18:47:49 crc kubenswrapper[4780]: I0929 18:47:49.047343 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bd54" event={"ID":"5059373a-528f-485b-afbe-2bd945289b0b","Type":"ContainerDied","Data":"922356983e4a6c7c363df810df13650cd90783b9ef9b4e69fd9760e3d5862604"} Sep 29 18:47:49 crc kubenswrapper[4780]: I0929 18:47:49.054316 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" 
event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerStarted","Data":"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833"} Sep 29 18:47:49 crc kubenswrapper[4780]: I0929 18:47:49.092137 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfstc" podStartSLOduration=3.590190522 podStartE2EDuration="6.092114604s" podCreationTimestamp="2025-09-29 18:47:43 +0000 UTC" firstStartedPulling="2025-09-29 18:47:45.011856566 +0000 UTC m=+264.960154610" lastFinishedPulling="2025-09-29 18:47:47.513780628 +0000 UTC m=+267.462078692" observedRunningTime="2025-09-29 18:47:49.09197843 +0000 UTC m=+269.040276474" watchObservedRunningTime="2025-09-29 18:47:49.092114604 +0000 UTC m=+269.040412648" Sep 29 18:47:50 crc kubenswrapper[4780]: I0929 18:47:50.078096 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bd54" event={"ID":"5059373a-528f-485b-afbe-2bd945289b0b","Type":"ContainerStarted","Data":"3405e60b65bcc7cabee4ee10a08036e09ef337df009719a63ea575adcb512873"} Sep 29 18:47:50 crc kubenswrapper[4780]: I0929 18:47:50.098264 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bd54" podStartSLOduration=2.66598592 podStartE2EDuration="6.098248096s" podCreationTimestamp="2025-09-29 18:47:44 +0000 UTC" firstStartedPulling="2025-09-29 18:47:46.024684205 +0000 UTC m=+265.972982269" lastFinishedPulling="2025-09-29 18:47:49.456946401 +0000 UTC m=+269.405244445" observedRunningTime="2025-09-29 18:47:50.095901033 +0000 UTC m=+270.044199077" watchObservedRunningTime="2025-09-29 18:47:50.098248096 +0000 UTC m=+270.046546130" Sep 29 18:47:51 crc kubenswrapper[4780]: I0929 18:47:51.638870 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:51 crc kubenswrapper[4780]: I0929 18:47:51.639406 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:51 crc kubenswrapper[4780]: I0929 18:47:51.693026 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:52 crc kubenswrapper[4780]: I0929 18:47:52.135570 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-68x8k" Sep 29 18:47:52 crc kubenswrapper[4780]: I0929 18:47:52.282330 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:52 crc kubenswrapper[4780]: I0929 18:47:52.283075 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:52 crc kubenswrapper[4780]: I0929 18:47:52.321816 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:53 crc kubenswrapper[4780]: I0929 18:47:53.137723 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xzf5k" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.067899 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.067986 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.117383 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.167808 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.647994 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.648113 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:54 crc kubenswrapper[4780]: I0929 18:47:54.720291 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:47:55 crc kubenswrapper[4780]: I0929 18:47:55.155120 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bd54" Sep 29 18:49:03 crc kubenswrapper[4780]: I0929 18:49:03.223290 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:49:03 crc kubenswrapper[4780]: I0929 18:49:03.225271 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:49:33 crc kubenswrapper[4780]: I0929 18:49:33.223479 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:49:33 crc kubenswrapper[4780]: I0929 18:49:33.224311 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.223539 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.224646 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.224718 4780 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.225762 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.225874 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b" gracePeriod=600 Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.969415 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b" exitCode=0 Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.969489 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b"} Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.969998 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122"} Sep 29 18:50:03 crc kubenswrapper[4780]: I0929 18:50:03.970094 4780 scope.go:117] "RemoveContainer" containerID="889e4104b7f1baa7b6d29283d6134ac4186866c4fc38c5869a61fc824baac5b5" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.497615 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sd7fk"] Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.499976 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.515484 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sd7fk"] Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526489 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f6a5637-809e-4391-b526-235e93c7992b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526583 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-trusted-ca\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526607 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-registry-tls\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526629 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-bound-sa-token\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526651 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f6a5637-809e-4391-b526-235e93c7992b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526706 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526755 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldp2n\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-kube-api-access-ldp2n\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.526791 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-registry-certificates\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.576530 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.627535 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-trusted-ca\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.627883 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-registry-tls\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.627959 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-bound-sa-token\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.628220 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f6a5637-809e-4391-b526-235e93c7992b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.628324 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldp2n\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-kube-api-access-ldp2n\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.628399 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-registry-certificates\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.628463 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f6a5637-809e-4391-b526-235e93c7992b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.629068 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5f6a5637-809e-4391-b526-235e93c7992b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.629603 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-trusted-ca\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.630280 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5f6a5637-809e-4391-b526-235e93c7992b-registry-certificates\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.635983 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5f6a5637-809e-4391-b526-235e93c7992b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.638439 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-registry-tls\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.645744 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-bound-sa-token\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.645986 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldp2n\" (UniqueName: \"kubernetes.io/projected/5f6a5637-809e-4391-b526-235e93c7992b-kube-api-access-ldp2n\") pod \"image-registry-66df7c8f76-sd7fk\" (UID: \"5f6a5637-809e-4391-b526-235e93c7992b\") " pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:06 crc kubenswrapper[4780]: I0929 18:51:06.819155 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:07 crc kubenswrapper[4780]: I0929 18:51:07.059865 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sd7fk"] Sep 29 18:51:07 crc kubenswrapper[4780]: I0929 18:51:07.388805 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" event={"ID":"5f6a5637-809e-4391-b526-235e93c7992b","Type":"ContainerStarted","Data":"0ddd94d359449234397a0df7d326284d4572d6edd985a5af25f7dda91127e0ee"} Sep 29 18:51:07 crc kubenswrapper[4780]: I0929 18:51:07.390129 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" event={"ID":"5f6a5637-809e-4391-b526-235e93c7992b","Type":"ContainerStarted","Data":"4f817f2461fceea0d1b3b05a801d061573a2a55d421a3f9efec996e020ab68ad"} Sep 29 18:51:07 crc kubenswrapper[4780]: I0929 18:51:07.390175 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:07 crc kubenswrapper[4780]: I0929 18:51:07.410401 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" podStartSLOduration=1.4103715989999999 podStartE2EDuration="1.410371599s" podCreationTimestamp="2025-09-29 18:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:51:07.40942683 +0000 UTC m=+467.357724894" watchObservedRunningTime="2025-09-29 18:51:07.410371599 +0000 UTC m=+467.358669643" Sep 29 18:51:26 crc kubenswrapper[4780]: I0929 18:51:26.834429 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-sd7fk" Sep 29 18:51:26 crc kubenswrapper[4780]: I0929 18:51:26.900872 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"] Sep 29 18:51:51 crc kubenswrapper[4780]: I0929 18:51:51.950669 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" podUID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" containerName="registry" containerID="cri-o://4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785" gracePeriod=30 Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.309750 4780 util.go:48] "No ready sandbox for pod can be found. 
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.467383 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.467887 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjt4w\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.467966 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.468028 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.468244 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.468424 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.468512 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.468640 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token\") pod \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\" (UID: \"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6\") "
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.469197 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.469253 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.475959 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w" (OuterVolumeSpecName: "kube-api-access-cjt4w") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "kube-api-access-cjt4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.476571 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.479060 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.479098 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.479288 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.496796 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" (UID: "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
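The unmount burst above is the kubelet's volume reconciler at work: once the deleted pod leaves the desired state of the world, every volume still present in the actual state gets an UnmountVolume operation, and each TearDown logs both the pod-facing OuterVolumeSpecName and the plugin-level InnerVolumeSpecName (note the CSI volume's inner name is the PV name, pvc-657094db-...). A toy sketch of that desired-versus-actual diff, not the kubelet's real code:

    package main

    import "fmt"

    // Toy illustration: unmount whatever is mounted in the actual state
    // but absent from the desired state.
    type volumeKey struct{ podUID, outerSpecName string }

    func reconcile(desired, actual map[volumeKey]bool) []volumeKey {
        var toUnmount []volumeKey
        for v := range actual {
            if !desired[v] {
                // -> "operationExecutor.UnmountVolume started ..."
                toUnmount = append(toUnmount, v)
            }
        }
        return toUnmount
    }

    func main() {
        pod := "0f3eb31f-21dd-4c76-bfb3-102fb999b7c6"
        actual := map[volumeKey]bool{
            {pod, "trusted-ca"}:   true,
            {pod, "registry-tls"}: true,
        }
        // The pod was deleted, so its volumes vanish from the desired state.
        fmt.Println(reconcile(map[volumeKey]bool{}, actual))
    }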
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.569920 4780 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-tls\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.569958 4780 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.569972 4780 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-bound-sa-token\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.569984 4780 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-registry-certificates\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.570000 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjt4w\" (UniqueName: \"kubernetes.io/projected/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-kube-api-access-cjt4w\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.570011 4780 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.570021 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.758209 4780 generic.go:334] "Generic (PLEG): container finished" podID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" containerID="4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785" exitCode=0
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.758364 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f"
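Each "container finished ... exitCode=0" line is the PLEG relaying a clean termination from CRI-O into pod status. Client code watching for the same condition would inspect ContainerStateTerminated; a minimal sketch follows, with a made-up container name:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // done reports whether every container in the pod status has terminated
    // cleanly, the condition an exitCode=0 event feeds into.
    func done(st corev1.PodStatus) bool {
        if len(st.ContainerStatuses) == 0 {
            return false
        }
        for _, cs := range st.ContainerStatuses {
            t := cs.State.Terminated
            if t == nil || t.ExitCode != 0 {
                return false
            }
        }
        return true
    }

    func main() {
        st := corev1.PodStatus{ContainerStatuses: []corev1.ContainerStatus{
            {Name: "registry", State: corev1.ContainerState{
                Terminated: &corev1.ContainerStateTerminated{ExitCode: 0},
            }},
        }}
        fmt.Println(done(st)) // true
    }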
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.765439 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" event={"ID":"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6","Type":"ContainerDied","Data":"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"}
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.765511 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8c67f" event={"ID":"0f3eb31f-21dd-4c76-bfb3-102fb999b7c6","Type":"ContainerDied","Data":"3b1c281ed7acf7ccf0534cd9a07bbbd6d9000f84f183ee71678f75110294d37b"}
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.765553 4780 scope.go:117] "RemoveContainer" containerID="4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.786013 4780 scope.go:117] "RemoveContainer" containerID="4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"
Sep 29 18:51:52 crc kubenswrapper[4780]: E0929 18:51:52.786634 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785\": container with ID starting with 4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785 not found: ID does not exist" containerID="4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.786681 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785"} err="failed to get container status \"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785\": rpc error: code = NotFound desc = could not find container \"4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785\": container with ID starting with 4bebced84c2f9441a684518e97b48248a6ae5ff094c5a8002a2e512b3fa06785 not found: ID does not exist"
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.815343 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"]
Sep 29 18:51:52 crc kubenswrapper[4780]: I0929 18:51:52.823711 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8c67f"]
Sep 29 18:51:54 crc kubenswrapper[4780]: I0929 18:51:54.765162 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" path="/var/lib/kubelet/pods/0f3eb31f-21dd-4c76-bfb3-102fb999b7c6/volumes"
Sep 29 18:52:03 crc kubenswrapper[4780]: I0929 18:52:03.223716 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 18:52:03 crc kubenswrapper[4780]: I0929 18:52:03.224417 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 18:52:33 crc kubenswrapper[4780]: I0929 18:52:33.223598 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 18:52:33 crc kubenswrapper[4780]: I0929 18:52:33.224178 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 18:53:03 crc kubenswrapper[4780]: I0929 18:53:03.223834 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 18:53:03 crc kubenswrapper[4780]: I0929 18:53:03.224610 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 18:53:03 crc kubenswrapper[4780]: I0929 18:53:03.224686 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w"
Sep 29 18:53:03 crc kubenswrapper[4780]: I0929 18:53:03.225622 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 18:53:03 crc kubenswrapper[4780]: I0929 18:53:03.225714 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122" gracePeriod=600
Sep 29 18:53:04 crc kubenswrapper[4780]: I0929 18:53:04.223376 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122" exitCode=0
Sep 29 18:53:04 crc kubenswrapper[4780]: I0929 18:53:04.223447 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122"}
Sep 29 18:53:04 crc kubenswrapper[4780]: I0929 18:53:04.223983 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d"}
Sep 29 18:53:04 crc kubenswrapper[4780]: I0929 18:53:04.224030 4780 scope.go:117] "RemoveContainer" containerID="c05a7039d50100743fdb64b2263f4cfb1bbcf6e2c50d90eb29f232397ee36e3b"
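The three probe failures above land exactly 30 s apart and the restart decision follows the third one at 18:53:03, so the probe evidently runs with a 30 s period and a failure threshold of 3; both numbers are inferred from the cadence, not read from the pod spec, and the gracePeriod=600 on the kill comes from the pod's terminationGracePeriodSeconds. Reconstructed as corev1 types, a sketch:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        // Endpoint taken from the log output above; PeriodSeconds and
        // FailureThreshold are assumptions inferred from the failure cadence.
        probe := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1",
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            PeriodSeconds:    30,
            FailureThreshold: 3,
        }
        fmt.Printf("%+v\n", probe)
    }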
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.043332 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jxt77"]
Sep 29 18:54:27 crc kubenswrapper[4780]: E0929 18:54:27.044689 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" containerName="registry"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.044718 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" containerName="registry"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.044916 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3eb31f-21dd-4c76-bfb3-102fb999b7c6" containerName="registry"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.049884 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.053157 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jxt77"]
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.055684 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.055688 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.056215 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.058287 4780 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76nlq"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.156560 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.156627 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.156678 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4bl\" (UniqueName: \"kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.258651 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4bl\" (UniqueName: \"kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.258792 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.258855 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.259271 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.259671 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.284371 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4bl\" (UniqueName: \"kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl\") pod \"crc-storage-crc-jxt77\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") " pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.378127 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.593225 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jxt77"]
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.600974 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 29 18:54:27 crc kubenswrapper[4780]: I0929 18:54:27.838720 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jxt77" event={"ID":"ed5c19d4-0266-4678-91ba-a446f91369b1","Type":"ContainerStarted","Data":"67e279c204dd8aaf7d5182be072babc34a935fbff79ee7c27a121942ecee227b"}
Sep 29 18:54:29 crc kubenswrapper[4780]: I0929 18:54:29.854301 4780 generic.go:334] "Generic (PLEG): container finished" podID="ed5c19d4-0266-4678-91ba-a446f91369b1" containerID="36fca5ee83fc7460f10b3faa2dcf636fccff1bc70b19ff11a2a1ce4bfb2e92b9" exitCode=0
Sep 29 18:54:29 crc kubenswrapper[4780]: I0929 18:54:29.854742 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jxt77" event={"ID":"ed5c19d4-0266-4678-91ba-a446f91369b1","Type":"ContainerDied","Data":"36fca5ee83fc7460f10b3faa2dcf636fccff1bc70b19ff11a2a1ce4bfb2e92b9"}
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.135350 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.218681 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq4bl\" (UniqueName: \"kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl\") pod \"ed5c19d4-0266-4678-91ba-a446f91369b1\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") "
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.218794 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt\") pod \"ed5c19d4-0266-4678-91ba-a446f91369b1\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") "
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.218851 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage\") pod \"ed5c19d4-0266-4678-91ba-a446f91369b1\" (UID: \"ed5c19d4-0266-4678-91ba-a446f91369b1\") "
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.219014 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ed5c19d4-0266-4678-91ba-a446f91369b1" (UID: "ed5c19d4-0266-4678-91ba-a446f91369b1"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.227329 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl" (OuterVolumeSpecName: "kube-api-access-tq4bl") pod "ed5c19d4-0266-4678-91ba-a446f91369b1" (UID: "ed5c19d4-0266-4678-91ba-a446f91369b1"). InnerVolumeSpecName "kube-api-access-tq4bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.236389 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ed5c19d4-0266-4678-91ba-a446f91369b1" (UID: "ed5c19d4-0266-4678-91ba-a446f91369b1"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
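The crc-storage pod's whole mount-and-unmount cycle above involves exactly three volume plugins: kubernetes.io/configmap, kubernetes.io/host-path, and the kubelet-generated kube-api-access-* projected token. A pod-spec sketch of the two declared volumes follows; the hostPath path is a placeholder, since the log never shows it:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // Volume names taken from the events above; everything else is assumed.
        hostPathType := corev1.HostPathDirectory
        vols := []corev1.Volume{
            {Name: "crc-storage", VolumeSource: corev1.VolumeSource{
                ConfigMap: &corev1.ConfigMapVolumeSource{
                    LocalObjectReference: corev1.LocalObjectReference{Name: "crc-storage"},
                },
            }},
            {Name: "node-mnt", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/mnt", Type: &hostPathType},
            }},
            // "kube-api-access-tq4bl" is the kubelet-generated projected
            // service-account token volume; it is not declared in the pod spec.
        }
        fmt.Println(len(vols), "volumes declared")
    }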
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.320096 4780 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ed5c19d4-0266-4678-91ba-a446f91369b1-node-mnt\") on node \"crc\" DevicePath \"\""
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.320156 4780 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ed5c19d4-0266-4678-91ba-a446f91369b1-crc-storage\") on node \"crc\" DevicePath \"\""
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.320172 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq4bl\" (UniqueName: \"kubernetes.io/projected/ed5c19d4-0266-4678-91ba-a446f91369b1-kube-api-access-tq4bl\") on node \"crc\" DevicePath \"\""
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.868893 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jxt77" event={"ID":"ed5c19d4-0266-4678-91ba-a446f91369b1","Type":"ContainerDied","Data":"67e279c204dd8aaf7d5182be072babc34a935fbff79ee7c27a121942ecee227b"}
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.868955 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67e279c204dd8aaf7d5182be072babc34a935fbff79ee7c27a121942ecee227b"
Sep 29 18:54:31 crc kubenswrapper[4780]: I0929 18:54:31.868997 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jxt77"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.694209 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"]
Sep 29 18:54:37 crc kubenswrapper[4780]: E0929 18:54:37.695421 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5c19d4-0266-4678-91ba-a446f91369b1" containerName="storage"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.695436 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5c19d4-0266-4678-91ba-a446f91369b1" containerName="storage"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.695626 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5c19d4-0266-4678-91ba-a446f91369b1" containerName="storage"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.696558 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.701619 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.704947 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"]
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.716795 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42lg\" (UniqueName: \"kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.716918 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.716997 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.818144 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.818254 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.818289 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42lg\" (UniqueName: \"kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.818839 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.818946 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:37 crc kubenswrapper[4780]: I0929 18:54:37.838727 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42lg\" (UniqueName: \"kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:38 crc kubenswrapper[4780]: I0929 18:54:38.020091 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
Sep 29 18:54:38 crc kubenswrapper[4780]: I0929 18:54:38.220937 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"]
Sep 29 18:54:38 crc kubenswrapper[4780]: I0929 18:54:38.914126 4780 generic.go:334] "Generic (PLEG): container finished" podID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerID="e579b3e81bf23b437d7e8b97f5f37a51b609856f010fa40109d9c31b49f9c3c6" exitCode=0
Sep 29 18:54:38 crc kubenswrapper[4780]: I0929 18:54:38.914198 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" event={"ID":"eca29428-21f3-4e84-81b2-a42e23d90c23","Type":"ContainerDied","Data":"e579b3e81bf23b437d7e8b97f5f37a51b609856f010fa40109d9c31b49f9c3c6"}
Sep 29 18:54:38 crc kubenswrapper[4780]: I0929 18:54:38.914251 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" event={"ID":"eca29428-21f3-4e84-81b2-a42e23d90c23","Type":"ContainerStarted","Data":"798578fed354cefed306e444e00d71949d2ee9d66308d75ed8d6d6d7078d9a32"}
Sep 29 18:54:40 crc kubenswrapper[4780]: I0929 18:54:40.927223 4780 generic.go:334] "Generic (PLEG): container finished" podID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerID="b41845f4875efdd116ecbc44ec8b7349500a11b6d33d68f89ac39fbc89397c7b" exitCode=0
Sep 29 18:54:40 crc kubenswrapper[4780]: I0929 18:54:40.927307 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" event={"ID":"eca29428-21f3-4e84-81b2-a42e23d90c23","Type":"ContainerDied","Data":"b41845f4875efdd116ecbc44ec8b7349500a11b6d33d68f89ac39fbc89397c7b"}
Sep 29 18:54:41 crc kubenswrapper[4780]: I0929 18:54:41.939753 4780 generic.go:334] "Generic (PLEG): container finished" podID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerID="0e9bafdbcf613c48bfd8950beee113065f91ed6fce3c543329f62efac15b6ed2" exitCode=0
Sep 29 18:54:41 crc kubenswrapper[4780]: I0929 18:54:41.939850 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" event={"ID":"eca29428-21f3-4e84-81b2-a42e23d90c23","Type":"ContainerDied","Data":"0e9bafdbcf613c48bfd8950beee113065f91ed6fce3c543329f62efac15b6ed2"}
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.042686 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p7vtr"]
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043215 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-controller" containerID="cri-o://32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043664 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="sbdb" containerID="cri-o://5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043736 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="nbdb" containerID="cri-o://b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043782 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="northd" containerID="cri-o://4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043825 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043865 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-node" containerID="cri-o://c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.043911 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-acl-logging" containerID="cri-o://0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.112903 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller" containerID="cri-o://c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" gracePeriod=30
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.406236 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/3.log"
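A DELETE for this multi-container pod fans out into one "Killing container with a grace period" entry per container, all stamped within the same hundred milliseconds, so the kills are issued back-to-back rather than one awaited after another. A toy model of that fan-out follows; the kubelet's real logic lives in kuberuntime_container.go and also handles preStop hooks and the SIGTERM-then-SIGKILL escalation:

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    // killAll issues one concurrent kill per container and waits for all of
    // them, mirroring the shape (not the code) of the log burst above.
    func killAll(containers []string, grace time.Duration) {
        var wg sync.WaitGroup
        for _, name := range containers {
            wg.Add(1)
            go func(name string) {
                defer wg.Done()
                fmt.Printf("Killing container with a grace period name=%s grace=%s\n", name, grace)
                // real code: send SIGTERM, wait up to grace, then SIGKILL
            }(name)
        }
        wg.Wait()
    }

    func main() {
        killAll([]string{"ovn-controller", "sbdb", "nbdb", "northd",
            "kube-rbac-proxy-ovn-metrics", "kube-rbac-proxy-node",
            "ovn-acl-logging", "ovnkube-controller"}, 30*time.Second)
    }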
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.408688 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovn-acl-logging/0.log"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.409342 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovn-controller/0.log"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.409841 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458179 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mf925"]
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458418 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="northd"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458432 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="northd"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458442 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="sbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458448 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="sbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458460 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-acl-logging"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458466 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-acl-logging"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458478 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458485 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458493 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458499 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458506 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kubecfg-setup"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458511 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kubecfg-setup"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458524 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458530 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458537 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458544 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458554 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-node"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458561 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-node"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458569 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="nbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458575 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="nbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458582 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-ovn-metrics"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458588 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-ovn-metrics"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458686 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458697 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458704 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="northd"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458711 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458720 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-node"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458728 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458735 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="sbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458743 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="kube-rbac-proxy-ovn-metrics"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458750 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovn-acl-logging"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458758 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458767 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="nbdb"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458859 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458867 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.458878 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458884 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.458980 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" containerName="ovnkube-controller"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.460620 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.483832 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-systemd-units\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.483900 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-netns\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.483953 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-systemd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.483979 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-var-lib-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.484548 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-slash\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.484616 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-kubelet\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585769 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585823 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585864 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585908 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585950 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.585986 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586010 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586027 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586063 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586092 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r2sf\" (UniqueName: \"kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586117 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586165 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586193 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586218 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586267 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586289 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586285 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586314 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586396 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586432 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586451 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash\") pod \"43a328df-2763-44f9-9512-3abb64ef45aa\" (UID: \"43a328df-2763-44f9-9512-3abb64ef45aa\") "
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586550 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-config\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586576 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpsxs\" (UniqueName: \"kubernetes.io/projected/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-kube-api-access-xpsxs\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586608 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-systemd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586633 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-var-lib-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586660 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586678 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-netd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
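The RemoveStaleState and "Deleted CPUSet assignment" burst above is the kubelet's resource managers purging checkpoint entries for the old ovnkube-node-p7vtr pod, one per container (with repeats for restarted ovnkube-controller instances), before the replacement pod is admitted. Illustratively, the state behaves like a map keyed by pod UID and container name; this sketch is not the kubelet's actual code:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStale drops every assignment whose pod no longer exists,
    // mirroring the shape of the log burst above.
    func removeStale(assignments map[key]string, livePods map[string]bool) {
        for k := range assignments {
            if !livePods[k.podUID] {
                fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n",
                    k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        old := "43a328df-2763-44f9-9512-3abb64ef45aa"
        state := map[key]string{
            {old, "northd"}: "0-3", // CPU sets here are invented values
            {old, "sbdb"}:   "0-3",
        }
        removeStale(state, map[string]bool{}) // the old pod is gone
        fmt.Println("remaining:", len(state))
    }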
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586698 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-slash\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586716 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-env-overrides\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586741 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-kubelet\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586779 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-bin\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586805 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-systemd-units\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586825 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-script-lib\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586853 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586870 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-node-log\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586889 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovn-node-metrics-cert\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-netns\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586927 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-log-socket\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586945 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586967 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-etc-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.586990 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-ovn\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925"
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587022 4780 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-netns\") on node \"crc\" DevicePath \"\""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587077 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587100 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587120 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log" (OuterVolumeSpecName: "node-log") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587139 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587332 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587392 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587424 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket" (OuterVolumeSpecName: "log-socket") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587431 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587450 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "run-openvswitch".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587478 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587506 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash" (OuterVolumeSpecName: "host-slash") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587583 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-systemd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587638 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-var-lib-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587692 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-slash\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587729 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-kubelet\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587763 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-systemd-units\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587793 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587813 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-netns\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587811 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587835 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.587860 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.588171 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.592970 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.593234 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf" (OuterVolumeSpecName: "kube-api-access-5r2sf") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "kube-api-access-5r2sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.600913 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "43a328df-2763-44f9-9512-3abb64ef45aa" (UID: "43a328df-2763-44f9-9512-3abb64ef45aa"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688314 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-bin\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688397 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-script-lib\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688437 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688477 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-bin\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688509 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-node-log\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688573 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovn-node-metrics-cert\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688605 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-log-socket\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688624 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688623 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: 
\"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688651 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-etc-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688675 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-etc-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688623 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-node-log\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688705 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-ovn\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688730 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-config\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688722 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-run-ovn-kubernetes\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsxs\" (UniqueName: \"kubernetes.io/projected/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-kube-api-access-xpsxs\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688805 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-ovn\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688860 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc 
kubenswrapper[4780]: I0929 18:54:42.688873 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-log-socket\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688886 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-netd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688930 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-run-openvswitch\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688910 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-host-cni-netd\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.688960 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-env-overrides\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689122 4780 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-slash\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689156 4780 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689179 4780 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689202 4780 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-node-log\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689219 4780 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689236 4780 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689255 4780 reconciler_common.go:293] "Volume detached for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689273 4780 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689291 4780 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689308 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r2sf\" (UniqueName: \"kubernetes.io/projected/43a328df-2763-44f9-9512-3abb64ef45aa-kube-api-access-5r2sf\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689326 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689342 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689360 4780 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689377 4780 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-log-socket\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689395 4780 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43a328df-2763-44f9-9512-3abb64ef45aa-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689412 4780 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689430 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43a328df-2763-44f9-9512-3abb64ef45aa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689448 4780 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689465 4780 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43a328df-2763-44f9-9512-3abb64ef45aa-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689687 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-config\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.689689 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-env-overrides\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.690146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovnkube-script-lib\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.694153 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-ovn-node-metrics-cert\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.705865 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpsxs\" (UniqueName: \"kubernetes.io/projected/411bf8ba-9d7a-45e5-8e6e-3e2645157dbe-kube-api-access-xpsxs\") pod \"ovnkube-node-mf925\" (UID: \"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe\") " pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.775421 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:42 crc kubenswrapper[4780]: W0929 18:54:42.809406 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod411bf8ba_9d7a_45e5_8e6e_3e2645157dbe.slice/crio-14a60aa7af5061e8d67038e27991083ff13aade59bf4e3b986d01bacaa423287 WatchSource:0}: Error finding container 14a60aa7af5061e8d67038e27991083ff13aade59bf4e3b986d01bacaa423287: Status 404 returned error can't find the container with id 14a60aa7af5061e8d67038e27991083ff13aade59bf4e3b986d01bacaa423287 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.951871 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovnkube-controller/3.log" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.955141 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovn-acl-logging/0.log" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.955853 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7vtr_43a328df-2763-44f9-9512-3abb64ef45aa/ovn-controller/0.log" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956263 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956308 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956326 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956343 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956360 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956373 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956386 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" exitCode=143 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956400 4780 generic.go:334] "Generic (PLEG): container finished" podID="43a328df-2763-44f9-9512-3abb64ef45aa" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" exitCode=143 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956463 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" 
event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956510 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956531 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956550 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956568 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956586 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956605 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956624 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956636 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956648 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956659 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956670 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956681 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956692 4780 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956703 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956719 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956738 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956752 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956764 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956775 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956803 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956814 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956826 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956838 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956848 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956859 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956873 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} 
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956889 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956902 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956912 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956923 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956934 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956944 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956955 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956965 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956975 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.956985 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957000 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" event={"ID":"43a328df-2763-44f9-9512-3abb64ef45aa","Type":"ContainerDied","Data":"ff6aee3711ee63e8d82b287ad9d225eeb8ccfdc00388a17604d9341baa1e1692"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957015 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957028 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957039 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} 
Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957084 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957100 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957113 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957123 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957134 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957144 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957154 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957181 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.957403 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7vtr" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.960247 4780 generic.go:334] "Generic (PLEG): container finished" podID="411bf8ba-9d7a-45e5-8e6e-3e2645157dbe" containerID="0e08305c9b3ea724ba4cd5e63e939108147b9a78f381f7b37b4672ea6b4f33f8" exitCode=0 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.960365 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerDied","Data":"0e08305c9b3ea724ba4cd5e63e939108147b9a78f381f7b37b4672ea6b4f33f8"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.960443 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"14a60aa7af5061e8d67038e27991083ff13aade59bf4e3b986d01bacaa423287"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.964957 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/2.log" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.966241 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/1.log" Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.966311 4780 generic.go:334] "Generic (PLEG): container finished" podID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" containerID="9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd" exitCode=2 Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.966703 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerDied","Data":"9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.966773 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d"} Sep 29 18:54:42 crc kubenswrapper[4780]: I0929 18:54:42.968230 4780 scope.go:117] "RemoveContainer" containerID="9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd" Sep 29 18:54:42 crc kubenswrapper[4780]: E0929 18:54:42.968643 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wc8rf_openshift-multus(2c2af9fc-5cef-48e3-8070-cf2767bc4a81)\"" pod="openshift-multus/multus-wc8rf" podUID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.008805 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.044161 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p7vtr"] Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.048954 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p7vtr"] Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.076340 4780 scope.go:117] "RemoveContainer" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.116105 4780 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.129661 4780 scope.go:117] "RemoveContainer" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.152021 4780 scope.go:117] "RemoveContainer" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.167870 4780 scope.go:117] "RemoveContainer" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.191452 4780 scope.go:117] "RemoveContainer" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.195446 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle\") pod \"eca29428-21f3-4e84-81b2-a42e23d90c23\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.195543 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f42lg\" (UniqueName: \"kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg\") pod \"eca29428-21f3-4e84-81b2-a42e23d90c23\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.195634 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util\") pod \"eca29428-21f3-4e84-81b2-a42e23d90c23\" (UID: \"eca29428-21f3-4e84-81b2-a42e23d90c23\") " Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.196645 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle" (OuterVolumeSpecName: "bundle") pod "eca29428-21f3-4e84-81b2-a42e23d90c23" (UID: "eca29428-21f3-4e84-81b2-a42e23d90c23"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.201608 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg" (OuterVolumeSpecName: "kube-api-access-f42lg") pod "eca29428-21f3-4e84-81b2-a42e23d90c23" (UID: "eca29428-21f3-4e84-81b2-a42e23d90c23"). InnerVolumeSpecName "kube-api-access-f42lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.210016 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util" (OuterVolumeSpecName: "util") pod "eca29428-21f3-4e84-81b2-a42e23d90c23" (UID: "eca29428-21f3-4e84-81b2-a42e23d90c23"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.247709 4780 scope.go:117] "RemoveContainer" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.263261 4780 scope.go:117] "RemoveContainer" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.288478 4780 scope.go:117] "RemoveContainer" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.296522 4780 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.298160 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f42lg\" (UniqueName: \"kubernetes.io/projected/eca29428-21f3-4e84-81b2-a42e23d90c23-kube-api-access-f42lg\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.298191 4780 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eca29428-21f3-4e84-81b2-a42e23d90c23-util\") on node \"crc\" DevicePath \"\"" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.308279 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.308833 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.308904 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} err="failed to get container status \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.308943 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.309385 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": container with ID starting with e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008 not found: ID does not exist" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.309427 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} err="failed to get container status 
\"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": rpc error: code = NotFound desc = could not find container \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": container with ID starting with e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.309461 4780 scope.go:117] "RemoveContainer" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.309698 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": container with ID starting with 5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632 not found: ID does not exist" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.309723 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} err="failed to get container status \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": rpc error: code = NotFound desc = could not find container \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": container with ID starting with 5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.309744 4780 scope.go:117] "RemoveContainer" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.310239 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": container with ID starting with b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa not found: ID does not exist" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310272 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} err="failed to get container status \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": rpc error: code = NotFound desc = could not find container \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": container with ID starting with b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310288 4780 scope.go:117] "RemoveContainer" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.310641 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": container with ID starting with 4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5 not found: ID does not exist" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310665 4780 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} err="failed to get container status \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": rpc error: code = NotFound desc = could not find container \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": container with ID starting with 4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310681 4780 scope.go:117] "RemoveContainer" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.310930 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": container with ID starting with 598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05 not found: ID does not exist" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310953 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} err="failed to get container status \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": rpc error: code = NotFound desc = could not find container \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": container with ID starting with 598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.310971 4780 scope.go:117] "RemoveContainer" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.311581 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": container with ID starting with c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe not found: ID does not exist" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.311719 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} err="failed to get container status \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": rpc error: code = NotFound desc = could not find container \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": container with ID starting with c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.311848 4780 scope.go:117] "RemoveContainer" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.312471 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": container with ID starting with 0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c not found: ID does not exist" 
containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.312518 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} err="failed to get container status \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": rpc error: code = NotFound desc = could not find container \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": container with ID starting with 0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.312554 4780 scope.go:117] "RemoveContainer" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.312948 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": container with ID starting with 32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549 not found: ID does not exist" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.312993 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} err="failed to get container status \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": rpc error: code = NotFound desc = could not find container \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": container with ID starting with 32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.313033 4780 scope.go:117] "RemoveContainer" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" Sep 29 18:54:43 crc kubenswrapper[4780]: E0929 18:54:43.313400 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": container with ID starting with e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814 not found: ID does not exist" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.313431 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} err="failed to get container status \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": rpc error: code = NotFound desc = could not find container \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": container with ID starting with e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.313457 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.313680 4780 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} err="failed to get container status \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.313703 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.314001 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} err="failed to get container status \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": rpc error: code = NotFound desc = could not find container \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": container with ID starting with e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.314023 4780 scope.go:117] "RemoveContainer" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.318544 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} err="failed to get container status \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": rpc error: code = NotFound desc = could not find container \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": container with ID starting with 5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.318630 4780 scope.go:117] "RemoveContainer" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319136 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} err="failed to get container status \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": rpc error: code = NotFound desc = could not find container \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": container with ID starting with b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319167 4780 scope.go:117] "RemoveContainer" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319448 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} err="failed to get container status \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": rpc error: code = NotFound desc = could not find container \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": container with ID starting with 4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5 not found: ID does not exist" Sep 
29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319536 4780 scope.go:117] "RemoveContainer" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319814 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} err="failed to get container status \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": rpc error: code = NotFound desc = could not find container \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": container with ID starting with 598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.319839 4780 scope.go:117] "RemoveContainer" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.320289 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} err="failed to get container status \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": rpc error: code = NotFound desc = could not find container \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": container with ID starting with c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.320366 4780 scope.go:117] "RemoveContainer" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.320791 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} err="failed to get container status \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": rpc error: code = NotFound desc = could not find container \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": container with ID starting with 0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.320872 4780 scope.go:117] "RemoveContainer" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.321234 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} err="failed to get container status \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": rpc error: code = NotFound desc = could not find container \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": container with ID starting with 32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.321307 4780 scope.go:117] "RemoveContainer" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.321656 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} err="failed to get container status 
\"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": rpc error: code = NotFound desc = could not find container \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": container with ID starting with e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.321726 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.322239 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} err="failed to get container status \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.322324 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.335430 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} err="failed to get container status \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": rpc error: code = NotFound desc = could not find container \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": container with ID starting with e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.335525 4780 scope.go:117] "RemoveContainer" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.336900 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} err="failed to get container status \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": rpc error: code = NotFound desc = could not find container \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": container with ID starting with 5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.336979 4780 scope.go:117] "RemoveContainer" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.341207 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} err="failed to get container status \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": rpc error: code = NotFound desc = could not find container \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": container with ID starting with b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.341323 4780 scope.go:117] "RemoveContainer" 
containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.341757 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} err="failed to get container status \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": rpc error: code = NotFound desc = could not find container \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": container with ID starting with 4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.341819 4780 scope.go:117] "RemoveContainer" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.342129 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} err="failed to get container status \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": rpc error: code = NotFound desc = could not find container \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": container with ID starting with 598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.342161 4780 scope.go:117] "RemoveContainer" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.342679 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} err="failed to get container status \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": rpc error: code = NotFound desc = could not find container \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": container with ID starting with c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.342706 4780 scope.go:117] "RemoveContainer" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.343864 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} err="failed to get container status \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": rpc error: code = NotFound desc = could not find container \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": container with ID starting with 0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.343897 4780 scope.go:117] "RemoveContainer" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.344181 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} err="failed to get container status \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": rpc error: code = NotFound desc = could not find 
container \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": container with ID starting with 32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.344210 4780 scope.go:117] "RemoveContainer" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.344497 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} err="failed to get container status \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": rpc error: code = NotFound desc = could not find container \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": container with ID starting with e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.344523 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345000 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} err="failed to get container status \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345105 4780 scope.go:117] "RemoveContainer" containerID="e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345420 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008"} err="failed to get container status \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": rpc error: code = NotFound desc = could not find container \"e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008\": container with ID starting with e958b93f7418f5b49493763c5a2620de79a29c4c099986015be6a7208a3f8008 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345450 4780 scope.go:117] "RemoveContainer" containerID="5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345674 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632"} err="failed to get container status \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": rpc error: code = NotFound desc = could not find container \"5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632\": container with ID starting with 5b9bdbbb5d0da473b7b0c0e1ff907ec9fa4ef0c88ace61175367cddb4dd24632 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345697 4780 scope.go:117] "RemoveContainer" containerID="b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345920 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa"} err="failed to get container status \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": rpc error: code = NotFound desc = could not find container \"b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa\": container with ID starting with b8ecc26030747824e69365e658fef6a1103a8b941444907236b9a6e0841396aa not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.345948 4780 scope.go:117] "RemoveContainer" containerID="4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346182 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5"} err="failed to get container status \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": rpc error: code = NotFound desc = could not find container \"4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5\": container with ID starting with 4af723a6afb57a08490e3662f5c5474dbc2de3701569a2a0fa04c5ad237dcdf5 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346206 4780 scope.go:117] "RemoveContainer" containerID="598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346416 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05"} err="failed to get container status \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": rpc error: code = NotFound desc = could not find container \"598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05\": container with ID starting with 598c4bc7b63eed574e2951176aa4d6d77c864fded79a4abc1e26033b627b8c05 not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346452 4780 scope.go:117] "RemoveContainer" containerID="c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346668 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe"} err="failed to get container status \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": rpc error: code = NotFound desc = could not find container \"c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe\": container with ID starting with c1ebcb1ca7ac7d4f22b2bea452f3da4341460b61584295f4e1e977069116a4fe not found: ID does not exist" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346697 4780 scope.go:117] "RemoveContainer" containerID="0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346898 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c"} err="failed to get container status \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": rpc error: code = NotFound desc = could not find container \"0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c\": container with ID starting with 
0bbc49e98339c2fb7ecdc464c9d7eedb783829025b2689de6c55ec4255ff612c not found: ID does not exist"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.346922 4780 scope.go:117] "RemoveContainer" containerID="32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.347119 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549"} err="failed to get container status \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": rpc error: code = NotFound desc = could not find container \"32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549\": container with ID starting with 32b733125b5ba0b7898bb0c9d1c82f647963eebe273177d0bc99b7e4a6f45549 not found: ID does not exist"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.347140 4780 scope.go:117] "RemoveContainer" containerID="e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.347401 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814"} err="failed to get container status \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": rpc error: code = NotFound desc = could not find container \"e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814\": container with ID starting with e04528f68dc290ba8a7e90dd067d13728168952e0c87cc14144f5d3f7a3a4814 not found: ID does not exist"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.347424 4780 scope.go:117] "RemoveContainer" containerID="c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.347652 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f"} err="failed to get container status \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": rpc error: code = NotFound desc = could not find container \"c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f\": container with ID starting with c4e9cd7fa9236d7f882f58d081a23f77388068c226dba62852ba9866d7f02c5f not found: ID does not exist"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.974640 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" event={"ID":"eca29428-21f3-4e84-81b2-a42e23d90c23","Type":"ContainerDied","Data":"798578fed354cefed306e444e00d71949d2ee9d66308d75ed8d6d6d7078d9a32"}
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.974690 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798578fed354cefed306e444e00d71949d2ee9d66308d75ed8d6d6d7078d9a32"
Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.974717 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr"
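The burst of paired "RemoveContainer" / "DeleteContainer returned error" entries above cycles through the same ten container IDs several times within roughly 40 ms. A NotFound from the runtime on DeleteContainer normally means CRI-O already removed the container, so these retries read as cleanup noise rather than fresh failures. A minimal sketch for separating that noise from genuinely distinct errors, assuming the journal excerpt has been saved to a local file (the filename kubelet-journal.log is hypothetical):

```python
#!/usr/bin/env python3
"""Count how often each container ID shows up in "DeleteContainer
returned error" entries, to separate repeated cleanup noise from
genuinely distinct failures. The input filename is hypothetical;
the excerpt could come from `journalctl -u kubelet` output."""
import re
from collections import Counter

PATTERN = re.compile(r'"DeleteContainer returned error" '
                     r'containerID=\{"Type":"cri-o","ID":"([0-9a-f]{64})"\}')

counts: Counter[str] = Counter()
with open("kubelet-journal.log", encoding="utf-8") as fh:
    for line in fh:
        counts.update(PATTERN.findall(line))

# An ID repeated many times is the kubelet re-deleting an
# already-gone container, not a new failure each time.
for container_id, n in counts.most_common():
    print(f"{n:3d}x {container_id[:12]}")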
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr" Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978449 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"75986afc04cda6b4d7606d9c8081c5e232dedb378bcbd1b765efed20317474fd"} Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978508 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"8c2bacc077cec4f50d0f7ca8b747b62c6a681d997f5cc5d5fbc4d4f5bcf01292"} Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978524 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"7e9dfe87992c807918b98bf910ed425d08cd8e850a497bf0a6258b997393f66d"} Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978537 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"e5bdeb93f75c49800ebb9f84ff8ef754a56890041eec25aad2ccc365757d7fbf"} Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978578 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"924f6bdff7fa88920839e43ecd04d8a96f78a8baaf1237c1da234bb94121ee40"} Sep 29 18:54:43 crc kubenswrapper[4780]: I0929 18:54:43.978591 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"cd98547eab28db979eaeb42fe646d7d988c9dec7d9c1580f933f1ef06e442972"} Sep 29 18:54:44 crc kubenswrapper[4780]: I0929 18:54:44.760623 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a328df-2763-44f9-9512-3abb64ef45aa" path="/var/lib/kubelet/pods/43a328df-2763-44f9-9512-3abb64ef45aa/volumes" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.186287 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq"] Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.186558 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="extract" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.186571 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="extract" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.186587 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="util" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.186593 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="util" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.186611 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="pull" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.186617 4780 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="pull" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.186715 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca29428-21f3-4e84-81b2-a42e23d90c23" containerName="extract" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.187152 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.189083 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.189186 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-h7rkd" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.190420 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.321687 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rjs\" (UniqueName: \"kubernetes.io/projected/c110df1d-a352-4af1-9b48-3d68bd11f230-kube-api-access-t2rjs\") pod \"nmstate-operator-5d6f6cfd66-qtlkq\" (UID: \"c110df1d-a352-4af1-9b48-3d68bd11f230\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.422873 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2rjs\" (UniqueName: \"kubernetes.io/projected/c110df1d-a352-4af1-9b48-3d68bd11f230-kube-api-access-t2rjs\") pod \"nmstate-operator-5d6f6cfd66-qtlkq\" (UID: \"c110df1d-a352-4af1-9b48-3d68bd11f230\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.444106 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rjs\" (UniqueName: \"kubernetes.io/projected/c110df1d-a352-4af1-9b48-3d68bd11f230-kube-api-access-t2rjs\") pod \"nmstate-operator-5d6f6cfd66-qtlkq\" (UID: \"c110df1d-a352-4af1-9b48-3d68bd11f230\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: I0929 18:54:45.515132 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.549311 4780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(71e6e4f88d274a7e6b2bb44de429d7d11f4c27d1c2c468b19116a3bc42fce6fc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.549428 4780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(71e6e4f88d274a7e6b2bb44de429d7d11f4c27d1c2c468b19116a3bc42fce6fc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.549472 4780 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(71e6e4f88d274a7e6b2bb44de429d7d11f4c27d1c2c468b19116a3bc42fce6fc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:45 crc kubenswrapper[4780]: E0929 18:54:45.549543 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(71e6e4f88d274a7e6b2bb44de429d7d11f4c27d1c2c468b19116a3bc42fce6fc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" podUID="c110df1d-a352-4af1-9b48-3d68bd11f230" Sep 29 18:54:47 crc kubenswrapper[4780]: I0929 18:54:47.001576 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"63b24127d80dbec803dca5846c240ebf89bcf1955206e8c98e0f39d2ee841234"} Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.019069 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" event={"ID":"411bf8ba-9d7a-45e5-8e6e-3e2645157dbe","Type":"ContainerStarted","Data":"2b0e0da3478b709de80e5e91ff87aa966c0a24098d8eb97935ad019ee19acc10"} Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.019632 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.049810 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.055670 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" podStartSLOduration=7.055643489 podStartE2EDuration="7.055643489s" podCreationTimestamp="2025-09-29 18:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:54:49.052333745 +0000 UTC m=+689.000631799" watchObservedRunningTime="2025-09-29 18:54:49.055643489 +0000 UTC m=+689.003941543" Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.123815 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq"] Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.123971 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:49 crc kubenswrapper[4780]: I0929 18:54:49.124494 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:49 crc kubenswrapper[4780]: E0929 18:54:49.162134 4780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(0876f129bc0661f267a4cb93815d190aa42eb9f0a1ff7dfef816bd80a2c29ff9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:54:49 crc kubenswrapper[4780]: E0929 18:54:49.162231 4780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(0876f129bc0661f267a4cb93815d190aa42eb9f0a1ff7dfef816bd80a2c29ff9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:49 crc kubenswrapper[4780]: E0929 18:54:49.162274 4780 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(0876f129bc0661f267a4cb93815d190aa42eb9f0a1ff7dfef816bd80a2c29ff9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:54:49 crc kubenswrapper[4780]: E0929 18:54:49.162336 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(0876f129bc0661f267a4cb93815d190aa42eb9f0a1ff7dfef816bd80a2c29ff9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" podUID="c110df1d-a352-4af1-9b48-3d68bd11f230" Sep 29 18:54:50 crc kubenswrapper[4780]: I0929 18:54:50.025101 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:50 crc kubenswrapper[4780]: I0929 18:54:50.025519 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:50 crc kubenswrapper[4780]: I0929 18:54:50.062567 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:54:53 crc kubenswrapper[4780]: I0929 18:54:53.753713 4780 scope.go:117] "RemoveContainer" containerID="9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd" Sep 29 18:54:53 crc kubenswrapper[4780]: E0929 18:54:53.755411 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wc8rf_openshift-multus(2c2af9fc-5cef-48e3-8070-cf2767bc4a81)\"" pod="openshift-multus/multus-wc8rf" podUID="2c2af9fc-5cef-48e3-8070-cf2767bc4a81" Sep 29 18:55:01 crc kubenswrapper[4780]: I0929 18:55:01.752223 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:01 crc kubenswrapper[4780]: I0929 18:55:01.753707 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:01 crc kubenswrapper[4780]: E0929 18:55:01.782358 4780 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(c8f3bda69f93c5e9360a9bf1bd6bfeb2a93e34c7dec1e42679043a21bf125754): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 29 18:55:01 crc kubenswrapper[4780]: E0929 18:55:01.782449 4780 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(c8f3bda69f93c5e9360a9bf1bd6bfeb2a93e34c7dec1e42679043a21bf125754): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:01 crc kubenswrapper[4780]: E0929 18:55:01.782478 4780 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(c8f3bda69f93c5e9360a9bf1bd6bfeb2a93e34c7dec1e42679043a21bf125754): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:01 crc kubenswrapper[4780]: E0929 18:55:01.782537 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate(c110df1d-a352-4af1-9b48-3d68bd11f230)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5d6f6cfd66-qtlkq_openshift-nmstate_c110df1d-a352-4af1-9b48-3d68bd11f230_0(c8f3bda69f93c5e9360a9bf1bd6bfeb2a93e34c7dec1e42679043a21bf125754): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" podUID="c110df1d-a352-4af1-9b48-3d68bd11f230" Sep 29 18:55:03 crc kubenswrapper[4780]: I0929 18:55:03.222992 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:55:03 crc kubenswrapper[4780]: I0929 18:55:03.223111 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:55:08 crc kubenswrapper[4780]: I0929 18:55:08.753726 4780 scope.go:117] "RemoveContainer" containerID="9e72eed2874a3197d3024f6117b220b2d4dcab94b6f2a290f9d2866bd48d86fd" Sep 29 18:55:09 crc kubenswrapper[4780]: I0929 18:55:09.147481 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/2.log" Sep 29 18:55:09 crc kubenswrapper[4780]: I0929 18:55:09.149383 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/1.log" Sep 29 18:55:09 crc kubenswrapper[4780]: I0929 18:55:09.149455 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wc8rf" event={"ID":"2c2af9fc-5cef-48e3-8070-cf2767bc4a81","Type":"ContainerStarted","Data":"92c43b1a43d2d997c1a64cb5dbbf66d217950c1b630cf37affb505cfbb91be65"} Sep 29 18:55:12 crc kubenswrapper[4780]: I0929 18:55:12.806404 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mf925" Sep 29 18:55:14 crc kubenswrapper[4780]: I0929 18:55:14.753099 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:14 crc kubenswrapper[4780]: I0929 18:55:14.753696 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" Sep 29 18:55:14 crc kubenswrapper[4780]: I0929 18:55:14.944972 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq"] Sep 29 18:55:15 crc kubenswrapper[4780]: I0929 18:55:15.199551 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" event={"ID":"c110df1d-a352-4af1-9b48-3d68bd11f230","Type":"ContainerStarted","Data":"13f762fb4765af87122d2f4da2416a9b3dca2c4a51a6f78ab6ae1352ffdc8038"} Sep 29 18:55:18 crc kubenswrapper[4780]: I0929 18:55:18.220530 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" event={"ID":"c110df1d-a352-4af1-9b48-3d68bd11f230","Type":"ContainerStarted","Data":"a91456f58d4a4630966e02c5eb27589a3195ec8d581dc701b93e17c026245b0f"} Sep 29 18:55:18 crc kubenswrapper[4780]: I0929 18:55:18.255529 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-qtlkq" podStartSLOduration=30.991215617 podStartE2EDuration="33.25550819s" podCreationTimestamp="2025-09-29 18:54:45 +0000 UTC" firstStartedPulling="2025-09-29 18:55:14.957978639 +0000 UTC m=+714.906276683" lastFinishedPulling="2025-09-29 18:55:17.222271212 +0000 UTC m=+717.170569256" observedRunningTime="2025-09-29 18:55:18.253618114 +0000 UTC m=+718.201916158" watchObservedRunningTime="2025-09-29 18:55:18.25550819 +0000 UTC m=+718.203806234" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.276774 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.278015 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.280739 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gq859" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.294264 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dq26q"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.295186 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.300331 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.300920 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.307309 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zdk8n"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.308318 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.326025 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dq26q"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.420301 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.421128 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.424006 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.424216 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qt6kk" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.424646 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428004 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bd2d744d-8c52-4b60-a8e0-95999db053fc-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428072 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-ovs-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428093 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-dbus-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428119 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tjx\" (UniqueName: \"kubernetes.io/projected/bd2d744d-8c52-4b60-a8e0-95999db053fc-kube-api-access-r6tjx\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428142 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-nmstate-lock\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428169 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c724q\" (UniqueName: \"kubernetes.io/projected/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-kube-api-access-c724q\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") 
" pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.428197 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/8630f421-2559-4a68-9f18-4eed4e760add-kube-api-access-8r86v\") pod \"nmstate-metrics-58fcddf996-tbxfv\" (UID: \"8630f421-2559-4a68-9f18-4eed4e760add\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.431337 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530094 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bd2d744d-8c52-4b60-a8e0-95999db053fc-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530450 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aca73d30-931f-40bb-8af6-ca484c734840-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530575 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530618 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-ovs-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530638 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-dbus-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530664 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tjx\" (UniqueName: \"kubernetes.io/projected/bd2d744d-8c52-4b60-a8e0-95999db053fc-kube-api-access-r6tjx\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530686 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-nmstate-lock\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530737 
4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-ovs-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530804 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-nmstate-lock\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.530939 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c724q\" (UniqueName: \"kubernetes.io/projected/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-kube-api-access-c724q\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.531087 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-dbus-socket\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.531169 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/8630f421-2559-4a68-9f18-4eed4e760add-kube-api-access-8r86v\") pod \"nmstate-metrics-58fcddf996-tbxfv\" (UID: \"8630f421-2559-4a68-9f18-4eed4e760add\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.531290 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tssxq\" (UniqueName: \"kubernetes.io/projected/aca73d30-931f-40bb-8af6-ca484c734840-kube-api-access-tssxq\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.546603 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bd2d744d-8c52-4b60-a8e0-95999db053fc-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.549649 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tjx\" (UniqueName: \"kubernetes.io/projected/bd2d744d-8c52-4b60-a8e0-95999db053fc-kube-api-access-r6tjx\") pod \"nmstate-webhook-6d689559c5-dq26q\" (UID: \"bd2d744d-8c52-4b60-a8e0-95999db053fc\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.563547 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c724q\" (UniqueName: \"kubernetes.io/projected/cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a-kube-api-access-c724q\") pod \"nmstate-handler-zdk8n\" (UID: \"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a\") " pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 
18:55:19.566923 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/8630f421-2559-4a68-9f18-4eed4e760add-kube-api-access-8r86v\") pod \"nmstate-metrics-58fcddf996-tbxfv\" (UID: \"8630f421-2559-4a68-9f18-4eed4e760add\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.598151 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.627269 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.633198 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tssxq\" (UniqueName: \"kubernetes.io/projected/aca73d30-931f-40bb-8af6-ca484c734840-kube-api-access-tssxq\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.633486 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aca73d30-931f-40bb-8af6-ca484c734840-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.633598 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: E0929 18:55:19.633826 4780 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 29 18:55:19 crc kubenswrapper[4780]: E0929 18:55:19.633989 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert podName:aca73d30-931f-40bb-8af6-ca484c734840 nodeName:}" failed. No retries permitted until 2025-09-29 18:55:20.133929911 +0000 UTC m=+720.082227955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-xq6qz" (UID: "aca73d30-931f-40bb-8af6-ca484c734840") : secret "plugin-serving-cert" not found Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.635426 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aca73d30-931f-40bb-8af6-ca484c734840-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.635742 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.639213 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86bd7dcbd7-d75tf"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.639927 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.664369 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bd7dcbd7-d75tf"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.666632 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tssxq\" (UniqueName: \"kubernetes.io/projected/aca73d30-931f-40bb-8af6-ca484c734840-kube-api-access-tssxq\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735018 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-service-ca\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735127 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-oauth-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735190 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735216 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-oauth-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735238 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-trusted-ca-bundle\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.735273 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxgl\" (UniqueName: \"kubernetes.io/projected/babdc1a5-fa4e-4973-ab50-8a996f020dfd-kube-api-access-fdxgl\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 
18:55:19.735347 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836126 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836188 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-service-ca\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836254 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-oauth-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836299 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836319 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-oauth-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836339 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-trusted-ca-bundle\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.836366 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxgl\" (UniqueName: \"kubernetes.io/projected/babdc1a5-fa4e-4973-ab50-8a996f020dfd-kube-api-access-fdxgl\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.838713 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: 
I0929 18:55:19.839751 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-oauth-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.839947 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-trusted-ca-bundle\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.840159 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/babdc1a5-fa4e-4973-ab50-8a996f020dfd-service-ca\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.844160 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-oauth-config\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.845213 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/babdc1a5-fa4e-4973-ab50-8a996f020dfd-console-serving-cert\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.856140 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxgl\" (UniqueName: \"kubernetes.io/projected/babdc1a5-fa4e-4973-ab50-8a996f020dfd-kube-api-access-fdxgl\") pod \"console-86bd7dcbd7-d75tf\" (UID: \"babdc1a5-fa4e-4973-ab50-8a996f020dfd\") " pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.861028 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.911356 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-dq26q"] Sep 29 18:55:19 crc kubenswrapper[4780]: I0929 18:55:19.975187 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.140126 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.146151 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aca73d30-931f-40bb-8af6-ca484c734840-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-xq6qz\" (UID: \"aca73d30-931f-40bb-8af6-ca484c734840\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.180921 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bd7dcbd7-d75tf"] Sep 29 18:55:20 crc kubenswrapper[4780]: W0929 18:55:20.191784 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabdc1a5_fa4e_4973_ab50_8a996f020dfd.slice/crio-7bb093e7aad316174af658a6047023f00bc5d5d4304034811a7cdbf1710aca60 WatchSource:0}: Error finding container 7bb093e7aad316174af658a6047023f00bc5d5d4304034811a7cdbf1710aca60: Status 404 returned error can't find the container with id 7bb093e7aad316174af658a6047023f00bc5d5d4304034811a7cdbf1710aca60 Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.247419 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bd7dcbd7-d75tf" event={"ID":"babdc1a5-fa4e-4973-ab50-8a996f020dfd","Type":"ContainerStarted","Data":"7bb093e7aad316174af658a6047023f00bc5d5d4304034811a7cdbf1710aca60"} Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.248826 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" event={"ID":"bd2d744d-8c52-4b60-a8e0-95999db053fc","Type":"ContainerStarted","Data":"8a882eea7b34bf90a7ad4125db40435c749e77627cf97b5680632a32dbd5ea12"} Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.249891 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" event={"ID":"8630f421-2559-4a68-9f18-4eed4e760add","Type":"ContainerStarted","Data":"3b66e2b1e2831b20a1291b8b230a36fb5c1905708894afae1b92f578636408c3"} Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.250973 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zdk8n" event={"ID":"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a","Type":"ContainerStarted","Data":"50ec0a3905bcf0be91e432a9b3f4a50377ccfce235ea6d0004d38a68ed063456"} Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.339195 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.800299 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz"] Sep 29 18:55:20 crc kubenswrapper[4780]: I0929 18:55:20.966663 4780 scope.go:117] "RemoveContainer" containerID="bca58d730b0dc872dd1e4792973c7e8d5a70988f3d9c5ea68a5383998a3a8b0d" Sep 29 18:55:21 crc kubenswrapper[4780]: I0929 18:55:21.260309 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bd7dcbd7-d75tf" event={"ID":"babdc1a5-fa4e-4973-ab50-8a996f020dfd","Type":"ContainerStarted","Data":"1a530890884aef66abe1c453ee7cd3d630110a5c5f41f105e2c8802a4ae2cfb6"} Sep 29 18:55:21 crc kubenswrapper[4780]: I0929 18:55:21.262424 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" event={"ID":"aca73d30-931f-40bb-8af6-ca484c734840","Type":"ContainerStarted","Data":"ed67c181dce288ff268a943009401dab9f4af9098814623c1af13aa320f682c8"} Sep 29 18:55:21 crc kubenswrapper[4780]: I0929 18:55:21.266032 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wc8rf_2c2af9fc-5cef-48e3-8070-cf2767bc4a81/kube-multus/2.log" Sep 29 18:55:21 crc kubenswrapper[4780]: I0929 18:55:21.284793 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86bd7dcbd7-d75tf" podStartSLOduration=2.2847694179999998 podStartE2EDuration="2.284769418s" podCreationTimestamp="2025-09-29 18:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:55:21.282589303 +0000 UTC m=+721.230887357" watchObservedRunningTime="2025-09-29 18:55:21.284769418 +0000 UTC m=+721.233067462" Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.283329 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zdk8n" event={"ID":"cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a","Type":"ContainerStarted","Data":"da60f5ed3544018e4d78e7c53673db8e9d71a7af429e3f860fcade9157bde230"} Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.284251 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.286661 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" event={"ID":"bd2d744d-8c52-4b60-a8e0-95999db053fc","Type":"ContainerStarted","Data":"0ec5f2b918af8f6e78c68f969cdad982545315571b963c8ece49a02292d8f038"} Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.286779 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.288472 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" event={"ID":"8630f421-2559-4a68-9f18-4eed4e760add","Type":"ContainerStarted","Data":"156bd3c8d6b8db4641dca505ee7d68ab329f6b8a2f340caf870d0bdfc009f659"} Sep 29 18:55:23 crc kubenswrapper[4780]: I0929 18:55:23.303063 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zdk8n" podStartSLOduration=1.444120253 podStartE2EDuration="4.30302873s" podCreationTimestamp="2025-09-29 18:55:19 +0000 UTC" 
firstStartedPulling="2025-09-29 18:55:19.691499371 +0000 UTC m=+719.639797425" lastFinishedPulling="2025-09-29 18:55:22.550407868 +0000 UTC m=+722.498705902" observedRunningTime="2025-09-29 18:55:23.298690192 +0000 UTC m=+723.246988266" watchObservedRunningTime="2025-09-29 18:55:23.30302873 +0000 UTC m=+723.251326774" Sep 29 18:55:24 crc kubenswrapper[4780]: I0929 18:55:24.307526 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" event={"ID":"aca73d30-931f-40bb-8af6-ca484c734840","Type":"ContainerStarted","Data":"8c6ec5c7e82d151fc620c0c11428ab71498372a453c15c0edcddac7ac219f757"} Sep 29 18:55:24 crc kubenswrapper[4780]: I0929 18:55:24.331849 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" podStartSLOduration=2.702639115 podStartE2EDuration="5.331821688s" podCreationTimestamp="2025-09-29 18:55:19 +0000 UTC" firstStartedPulling="2025-09-29 18:55:19.918210346 +0000 UTC m=+719.866508390" lastFinishedPulling="2025-09-29 18:55:22.547392929 +0000 UTC m=+722.495690963" observedRunningTime="2025-09-29 18:55:23.318849434 +0000 UTC m=+723.267147478" watchObservedRunningTime="2025-09-29 18:55:24.331821688 +0000 UTC m=+724.280119732" Sep 29 18:55:24 crc kubenswrapper[4780]: I0929 18:55:24.332299 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-xq6qz" podStartSLOduration=2.645560049 podStartE2EDuration="5.332294452s" podCreationTimestamp="2025-09-29 18:55:19 +0000 UTC" firstStartedPulling="2025-09-29 18:55:20.821582332 +0000 UTC m=+720.769880376" lastFinishedPulling="2025-09-29 18:55:23.508316735 +0000 UTC m=+723.456614779" observedRunningTime="2025-09-29 18:55:24.323856294 +0000 UTC m=+724.272154338" watchObservedRunningTime="2025-09-29 18:55:24.332294452 +0000 UTC m=+724.280592496" Sep 29 18:55:25 crc kubenswrapper[4780]: I0929 18:55:25.317929 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" event={"ID":"8630f421-2559-4a68-9f18-4eed4e760add","Type":"ContainerStarted","Data":"04a3287470afdf2fc20b8458c6611df1bd78aa78819a97343ecc2892a08aa03b"} Sep 29 18:55:29 crc kubenswrapper[4780]: I0929 18:55:29.674185 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zdk8n" Sep 29 18:55:29 crc kubenswrapper[4780]: I0929 18:55:29.704133 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tbxfv" podStartSLOduration=5.658085624 podStartE2EDuration="10.704106629s" podCreationTimestamp="2025-09-29 18:55:19 +0000 UTC" firstStartedPulling="2025-09-29 18:55:19.869960259 +0000 UTC m=+719.818258303" lastFinishedPulling="2025-09-29 18:55:24.915981264 +0000 UTC m=+724.864279308" observedRunningTime="2025-09-29 18:55:25.340566586 +0000 UTC m=+725.288864650" watchObservedRunningTime="2025-09-29 18:55:29.704106629 +0000 UTC m=+729.652404693" Sep 29 18:55:29 crc kubenswrapper[4780]: I0929 18:55:29.975639 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:29 crc kubenswrapper[4780]: I0929 18:55:29.976246 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:29 crc kubenswrapper[4780]: I0929 18:55:29.981998 4780 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:30 crc kubenswrapper[4780]: I0929 18:55:30.359951 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86bd7dcbd7-d75tf" Sep 29 18:55:30 crc kubenswrapper[4780]: I0929 18:55:30.420147 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:55:33 crc kubenswrapper[4780]: I0929 18:55:33.223233 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:55:33 crc kubenswrapper[4780]: I0929 18:55:33.223739 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:55:39 crc kubenswrapper[4780]: I0929 18:55:39.634030 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-dq26q" Sep 29 18:55:49 crc kubenswrapper[4780]: I0929 18:55:49.500826 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:55:49 crc kubenswrapper[4780]: I0929 18:55:49.501883 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" containerName="controller-manager" containerID="cri-o://47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0" gracePeriod=30 Sep 29 18:55:49 crc kubenswrapper[4780]: I0929 18:55:49.582773 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:55:49 crc kubenswrapper[4780]: I0929 18:55:49.583118 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" containerName="route-controller-manager" containerID="cri-o://a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482" gracePeriod=30 Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.007142 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.013292 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.038249 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert\") pod \"51e4222f-7fd5-41eb-afcc-832602668ada\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.038333 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca\") pod \"51e4222f-7fd5-41eb-afcc-832602668ada\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039198 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9wd6\" (UniqueName: \"kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6\") pod \"51e4222f-7fd5-41eb-afcc-832602668ada\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039269 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert\") pod \"8f7506da-aefc-4178-b6a2-408e686c8040\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039366 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles\") pod \"51e4222f-7fd5-41eb-afcc-832602668ada\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039398 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca\") pod \"8f7506da-aefc-4178-b6a2-408e686c8040\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039431 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config\") pod \"8f7506da-aefc-4178-b6a2-408e686c8040\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039462 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config\") pod \"51e4222f-7fd5-41eb-afcc-832602668ada\" (UID: \"51e4222f-7fd5-41eb-afcc-832602668ada\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.039481 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-689rt\" (UniqueName: \"kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt\") pod \"8f7506da-aefc-4178-b6a2-408e686c8040\" (UID: \"8f7506da-aefc-4178-b6a2-408e686c8040\") " Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.040967 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca" (OuterVolumeSpecName: "client-ca") pod "51e4222f-7fd5-41eb-afcc-832602668ada" (UID: 
"51e4222f-7fd5-41eb-afcc-832602668ada"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.042211 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f7506da-aefc-4178-b6a2-408e686c8040" (UID: "8f7506da-aefc-4178-b6a2-408e686c8040"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.042395 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config" (OuterVolumeSpecName: "config") pod "8f7506da-aefc-4178-b6a2-408e686c8040" (UID: "8f7506da-aefc-4178-b6a2-408e686c8040"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.042830 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "51e4222f-7fd5-41eb-afcc-832602668ada" (UID: "51e4222f-7fd5-41eb-afcc-832602668ada"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.045406 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config" (OuterVolumeSpecName: "config") pod "51e4222f-7fd5-41eb-afcc-832602668ada" (UID: "51e4222f-7fd5-41eb-afcc-832602668ada"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.049790 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt" (OuterVolumeSpecName: "kube-api-access-689rt") pod "8f7506da-aefc-4178-b6a2-408e686c8040" (UID: "8f7506da-aefc-4178-b6a2-408e686c8040"). InnerVolumeSpecName "kube-api-access-689rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.051655 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f7506da-aefc-4178-b6a2-408e686c8040" (UID: "8f7506da-aefc-4178-b6a2-408e686c8040"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.051784 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6" (OuterVolumeSpecName: "kube-api-access-m9wd6") pod "51e4222f-7fd5-41eb-afcc-832602668ada" (UID: "51e4222f-7fd5-41eb-afcc-832602668ada"). InnerVolumeSpecName "kube-api-access-m9wd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.056539 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51e4222f-7fd5-41eb-afcc-832602668ada" (UID: "51e4222f-7fd5-41eb-afcc-832602668ada"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.140995 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141419 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-689rt\" (UniqueName: \"kubernetes.io/projected/8f7506da-aefc-4178-b6a2-408e686c8040-kube-api-access-689rt\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141439 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e4222f-7fd5-41eb-afcc-832602668ada-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141453 4780 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141468 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9wd6\" (UniqueName: \"kubernetes.io/projected/51e4222f-7fd5-41eb-afcc-832602668ada-kube-api-access-m9wd6\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141480 4780 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f7506da-aefc-4178-b6a2-408e686c8040-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141493 4780 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51e4222f-7fd5-41eb-afcc-832602668ada-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141506 4780 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-client-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.141516 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7506da-aefc-4178-b6a2-408e686c8040-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.494825 4780 generic.go:334] "Generic (PLEG): container finished" podID="8f7506da-aefc-4178-b6a2-408e686c8040" containerID="a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482" exitCode=0 Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.495477 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" event={"ID":"8f7506da-aefc-4178-b6a2-408e686c8040","Type":"ContainerDied","Data":"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482"} Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.495519 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" event={"ID":"8f7506da-aefc-4178-b6a2-408e686c8040","Type":"ContainerDied","Data":"9883ee86b687d3441df05a4f57db15305a84967dea4ab43b2d45aa4db294933e"} Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.495541 4780 scope.go:117] "RemoveContainer" 
containerID="a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.495688 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.498889 4780 generic.go:334] "Generic (PLEG): container finished" podID="51e4222f-7fd5-41eb-afcc-832602668ada" containerID="47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0" exitCode=0 Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.498988 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" event={"ID":"51e4222f-7fd5-41eb-afcc-832602668ada","Type":"ContainerDied","Data":"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0"} Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.499064 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" event={"ID":"51e4222f-7fd5-41eb-afcc-832602668ada","Type":"ContainerDied","Data":"1339325007e9539dc229362367cfee2f2606d94b6299f36fb5ceb0f17cefe24c"} Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.498937 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vr2qc" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.526853 4780 scope.go:117] "RemoveContainer" containerID="a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482" Sep 29 18:55:50 crc kubenswrapper[4780]: E0929 18:55:50.533397 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482\": container with ID starting with a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482 not found: ID does not exist" containerID="a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.533700 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482"} err="failed to get container status \"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482\": rpc error: code = NotFound desc = could not find container \"a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482\": container with ID starting with a0c7c2d22426f3360ff49138e2619ac85d5b37d0258fc7090393aee5c98f0482 not found: ID does not exist" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.533778 4780 scope.go:117] "RemoveContainer" containerID="47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.537693 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.540317 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vr2qc"] Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.550962 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.553581 4780 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8l7s"] Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.558173 4780 scope.go:117] "RemoveContainer" containerID="47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0" Sep 29 18:55:50 crc kubenswrapper[4780]: E0929 18:55:50.558973 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0\": container with ID starting with 47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0 not found: ID does not exist" containerID="47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.559012 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0"} err="failed to get container status \"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0\": rpc error: code = NotFound desc = could not find container \"47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0\": container with ID starting with 47bfba40fdf77d74a657ad2a53cebe12530d35483185535268a6f182f6d352b0 not found: ID does not exist" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.762619 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" path="/var/lib/kubelet/pods/51e4222f-7fd5-41eb-afcc-832602668ada/volumes" Sep 29 18:55:50 crc kubenswrapper[4780]: I0929 18:55:50.763971 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" path="/var/lib/kubelet/pods/8f7506da-aefc-4178-b6a2-408e686c8040/volumes" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.463198 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7949f545d-wh24t"] Sep 29 18:55:51 crc kubenswrapper[4780]: E0929 18:55:51.464115 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" containerName="route-controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.464133 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" containerName="route-controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: E0929 18:55:51.464152 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" containerName="controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.464160 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" containerName="controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.464288 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7506da-aefc-4178-b6a2-408e686c8040" containerName="route-controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.464305 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e4222f-7fd5-41eb-afcc-832602668ada" containerName="controller-manager" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.464865 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.465837 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq"] Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.466751 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.468809 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.469209 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.472565 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.473386 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.473601 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.473756 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.473851 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.473965 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.474024 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.474162 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.474370 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.478119 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7949f545d-wh24t"] Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.478807 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.480943 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.487629 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq"] Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664069 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vkb\" (UniqueName: 
\"kubernetes.io/projected/d7bc338b-082c-432b-9936-e5756ce65df9-kube-api-access-g6vkb\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664141 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-client-ca\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664182 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-proxy-ca-bundles\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664206 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-config\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664346 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-client-ca\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664460 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-config\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664530 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bc338b-082c-432b-9936-e5756ce65df9-serving-cert\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664552 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d849762-e461-4e8b-816e-830e15e363b2-serving-cert\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.664673 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzbq2\" (UniqueName: 
\"kubernetes.io/projected/6d849762-e461-4e8b-816e-830e15e363b2-kube-api-access-lzbq2\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766411 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-client-ca\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766472 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-config\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766498 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d849762-e461-4e8b-816e-830e15e363b2-serving-cert\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766514 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bc338b-082c-432b-9936-e5756ce65df9-serving-cert\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766554 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzbq2\" (UniqueName: \"kubernetes.io/projected/6d849762-e461-4e8b-816e-830e15e363b2-kube-api-access-lzbq2\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766590 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vkb\" (UniqueName: \"kubernetes.io/projected/d7bc338b-082c-432b-9936-e5756ce65df9-kube-api-access-g6vkb\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.766614 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-client-ca\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.767500 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-client-ca\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: 
\"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.767847 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-proxy-ca-bundles\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.767899 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-config\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.767975 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-client-ca\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.768162 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-proxy-ca-bundles\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.768552 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bc338b-082c-432b-9936-e5756ce65df9-config\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.769110 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d849762-e461-4e8b-816e-830e15e363b2-config\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.775648 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bc338b-082c-432b-9936-e5756ce65df9-serving-cert\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.776942 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d849762-e461-4e8b-816e-830e15e363b2-serving-cert\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.785907 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lzbq2\" (UniqueName: \"kubernetes.io/projected/6d849762-e461-4e8b-816e-830e15e363b2-kube-api-access-lzbq2\") pod \"route-controller-manager-5c49bbf8f6-n74cq\" (UID: \"6d849762-e461-4e8b-816e-830e15e363b2\") " pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.788181 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vkb\" (UniqueName: \"kubernetes.io/projected/d7bc338b-082c-432b-9936-e5756ce65df9-kube-api-access-g6vkb\") pod \"controller-manager-7949f545d-wh24t\" (UID: \"d7bc338b-082c-432b-9936-e5756ce65df9\") " pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.791711 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:51 crc kubenswrapper[4780]: I0929 18:55:51.806556 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.027358 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7949f545d-wh24t"] Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.079541 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq"] Sep 29 18:55:52 crc kubenswrapper[4780]: W0929 18:55:52.089007 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d849762_e461_4e8b_816e_830e15e363b2.slice/crio-f602b7657cca18676f9bb24ead24509fb0eac4262a2822df2a1baeaa1bbc5f20 WatchSource:0}: Error finding container f602b7657cca18676f9bb24ead24509fb0eac4262a2822df2a1baeaa1bbc5f20: Status 404 returned error can't find the container with id f602b7657cca18676f9bb24ead24509fb0eac4262a2822df2a1baeaa1bbc5f20 Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.525262 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" event={"ID":"d7bc338b-082c-432b-9936-e5756ce65df9","Type":"ContainerStarted","Data":"47e000bf3f11c1d7fdf14fbfbb764d8f38bd44f98c883b194549b7dff5897fe3"} Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.525318 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" event={"ID":"d7bc338b-082c-432b-9936-e5756ce65df9","Type":"ContainerStarted","Data":"2dd9a7b9ae2869f81f6fd3b61fcbfe77b784f9fe7497c6312e6cd89a519fd4da"} Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.527235 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.533119 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" event={"ID":"6d849762-e461-4e8b-816e-830e15e363b2","Type":"ContainerStarted","Data":"7cc89f7f77f97ae782f720de28eab31f4eb8739dfea6547d51f60272f25cb425"} Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.533151 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" 
event={"ID":"6d849762-e461-4e8b-816e-830e15e363b2","Type":"ContainerStarted","Data":"f602b7657cca18676f9bb24ead24509fb0eac4262a2822df2a1baeaa1bbc5f20"} Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.533784 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.541714 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.545322 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7949f545d-wh24t" podStartSLOduration=3.545308622 podStartE2EDuration="3.545308622s" podCreationTimestamp="2025-09-29 18:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:55:52.544686704 +0000 UTC m=+752.492984748" watchObservedRunningTime="2025-09-29 18:55:52.545308622 +0000 UTC m=+752.493606666" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.554808 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" Sep 29 18:55:52 crc kubenswrapper[4780]: I0929 18:55:52.595778 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c49bbf8f6-n74cq" podStartSLOduration=3.595751013 podStartE2EDuration="3.595751013s" podCreationTimestamp="2025-09-29 18:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:55:52.591665283 +0000 UTC m=+752.539963327" watchObservedRunningTime="2025-09-29 18:55:52.595751013 +0000 UTC m=+752.544049057" Sep 29 18:55:53 crc kubenswrapper[4780]: I0929 18:55:53.943383 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn"] Sep 29 18:55:53 crc kubenswrapper[4780]: I0929 18:55:53.945345 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:53 crc kubenswrapper[4780]: I0929 18:55:53.947584 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 29 18:55:53 crc kubenswrapper[4780]: I0929 18:55:53.956573 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn"] Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.107287 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.107425 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.107454 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rjrg\" (UniqueName: \"kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.208840 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rjrg\" (UniqueName: \"kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.208922 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.209013 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.209517 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.209914 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.241066 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rjrg\" (UniqueName: \"kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.266800 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:55:54 crc kubenswrapper[4780]: I0929 18:55:54.698194 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn"] Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.466348 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-77bv2" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerName="console" containerID="cri-o://a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7" gracePeriod=15 Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.562500 4780 generic.go:334] "Generic (PLEG): container finished" podID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerID="6d8c82bd464e0391a5cf7095d055ba6997285b24c1dba9faf1b225b2dd6939c5" exitCode=0 Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.562556 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" event={"ID":"fad3af9b-342c-4ae5-b607-5efaaf0a9a05","Type":"ContainerDied","Data":"6d8c82bd464e0391a5cf7095d055ba6997285b24c1dba9faf1b225b2dd6939c5"} Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.562588 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" event={"ID":"fad3af9b-342c-4ae5-b607-5efaaf0a9a05","Type":"ContainerStarted","Data":"91b12dade25563721956f9490fc13364bdbb5eef366690199a6ff453728ce7b1"} Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.851280 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-77bv2_fe9bf3ec-728e-4304-b5f3-9c8e80ec9672/console/0.log" Sep 29 18:55:55 crc kubenswrapper[4780]: I0929 18:55:55.851720 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.039282 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.040869 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.040920 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wlx7\" (UniqueName: \"kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.040968 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.041019 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.041165 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.041196 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert\") pod \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\" (UID: \"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672\") " Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.042160 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.042338 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config" (OuterVolumeSpecName: "console-config") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.042409 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca" (OuterVolumeSpecName: "service-ca") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.042613 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.046996 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.053158 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7" (OuterVolumeSpecName: "kube-api-access-9wlx7") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "kube-api-access-9wlx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.054111 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" (UID: "fe9bf3ec-728e-4304-b5f3-9c8e80ec9672"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.143985 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wlx7\" (UniqueName: \"kubernetes.io/projected/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-kube-api-access-9wlx7\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144041 4780 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144085 4780 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-service-ca\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144103 4780 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144156 4780 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144174 4780 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.144190 4780 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.293135 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:55:56 crc kubenswrapper[4780]: E0929 18:55:56.293383 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerName="console" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.293396 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerName="console" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.293505 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerName="console" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.294270 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.309187 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.449129 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.449205 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwv59\" (UniqueName: \"kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.449288 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.550464 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.550549 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.550578 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv59\" (UniqueName: \"kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.551187 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.551262 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.565497 4780 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.568786 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwv59\" (UniqueName: \"kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59\") pod \"redhat-operators-s92cg\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570662 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-77bv2_fe9bf3ec-728e-4304-b5f3-9c8e80ec9672/console/0.log" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570719 4780 generic.go:334] "Generic (PLEG): container finished" podID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" containerID="a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7" exitCode=2 Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570764 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77bv2" event={"ID":"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672","Type":"ContainerDied","Data":"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7"} Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570847 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-77bv2" event={"ID":"fe9bf3ec-728e-4304-b5f3-9c8e80ec9672","Type":"ContainerDied","Data":"c8c0ed1a73c68ff2393b25aac1032e01a6a2f16234a9b1d2b5d51c4626b7c410"} Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570879 4780 scope.go:117] "RemoveContainer" containerID="a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.570788 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-77bv2" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.600065 4780 scope.go:117] "RemoveContainer" containerID="a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7" Sep 29 18:55:56 crc kubenswrapper[4780]: E0929 18:55:56.600781 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7\": container with ID starting with a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7 not found: ID does not exist" containerID="a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.600844 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7"} err="failed to get container status \"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7\": rpc error: code = NotFound desc = could not find container \"a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7\": container with ID starting with a9f18ad68126b0a406dbf73f56f999a2d48cfbf217c30516527619508e2ab4f7 not found: ID does not exist" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.611274 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.612339 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.618311 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-77bv2"] Sep 29 18:55:56 crc kubenswrapper[4780]: I0929 18:55:56.806023 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9bf3ec-728e-4304-b5f3-9c8e80ec9672" path="/var/lib/kubelet/pods/fe9bf3ec-728e-4304-b5f3-9c8e80ec9672/volumes" Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.109987 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:55:57 crc kubenswrapper[4780]: W0929 18:55:57.114152 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944ee6f8_1f23_49ce_877c_c2093b160862.slice/crio-a7ab2f60f3b2d284accb3e9e12c29e5b3927eb5d0a9a3841e12bb0136e350b9a WatchSource:0}: Error finding container a7ab2f60f3b2d284accb3e9e12c29e5b3927eb5d0a9a3841e12bb0136e350b9a: Status 404 returned error can't find the container with id a7ab2f60f3b2d284accb3e9e12c29e5b3927eb5d0a9a3841e12bb0136e350b9a Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.577218 4780 generic.go:334] "Generic (PLEG): container finished" podID="944ee6f8-1f23-49ce-877c-c2093b160862" containerID="bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640" exitCode=0 Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.577331 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerDied","Data":"bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640"} Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.577756 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerStarted","Data":"a7ab2f60f3b2d284accb3e9e12c29e5b3927eb5d0a9a3841e12bb0136e350b9a"} Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.581040 4780 generic.go:334] "Generic (PLEG): container finished" podID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerID="4ba51ba764c5f056544441d720df3a61e303506f0e08c2dd6ff2c90d1d1bad6c" exitCode=0 Sep 29 18:55:57 crc kubenswrapper[4780]: I0929 18:55:57.581175 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" event={"ID":"fad3af9b-342c-4ae5-b607-5efaaf0a9a05","Type":"ContainerDied","Data":"4ba51ba764c5f056544441d720df3a61e303506f0e08c2dd6ff2c90d1d1bad6c"} Sep 29 18:55:58 crc kubenswrapper[4780]: I0929 18:55:58.591679 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerStarted","Data":"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae"} Sep 29 18:55:58 crc kubenswrapper[4780]: I0929 18:55:58.594257 4780 generic.go:334] "Generic (PLEG): container finished" podID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerID="4a678aa26db667d3a91d130857b7234db6f1c52fc999a9e1bc9df2f162a36a02" exitCode=0 Sep 29 18:55:58 crc kubenswrapper[4780]: I0929 18:55:58.594309 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" 
event={"ID":"fad3af9b-342c-4ae5-b607-5efaaf0a9a05","Type":"ContainerDied","Data":"4a678aa26db667d3a91d130857b7234db6f1c52fc999a9e1bc9df2f162a36a02"} Sep 29 18:55:59 crc kubenswrapper[4780]: I0929 18:55:59.603328 4780 generic.go:334] "Generic (PLEG): container finished" podID="944ee6f8-1f23-49ce-877c-c2093b160862" containerID="3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae" exitCode=0 Sep 29 18:55:59 crc kubenswrapper[4780]: I0929 18:55:59.603405 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerDied","Data":"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae"} Sep 29 18:55:59 crc kubenswrapper[4780]: I0929 18:55:59.945809 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.104204 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util\") pod \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.104303 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rjrg\" (UniqueName: \"kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg\") pod \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.104350 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle\") pod \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\" (UID: \"fad3af9b-342c-4ae5-b607-5efaaf0a9a05\") " Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.105480 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle" (OuterVolumeSpecName: "bundle") pod "fad3af9b-342c-4ae5-b607-5efaaf0a9a05" (UID: "fad3af9b-342c-4ae5-b607-5efaaf0a9a05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.111075 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg" (OuterVolumeSpecName: "kube-api-access-5rjrg") pod "fad3af9b-342c-4ae5-b607-5efaaf0a9a05" (UID: "fad3af9b-342c-4ae5-b607-5efaaf0a9a05"). InnerVolumeSpecName "kube-api-access-5rjrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.119700 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util" (OuterVolumeSpecName: "util") pod "fad3af9b-342c-4ae5-b607-5efaaf0a9a05" (UID: "fad3af9b-342c-4ae5-b607-5efaaf0a9a05"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.205918 4780 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-util\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.205956 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rjrg\" (UniqueName: \"kubernetes.io/projected/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-kube-api-access-5rjrg\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.205970 4780 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fad3af9b-342c-4ae5-b607-5efaaf0a9a05-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.611484 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.613182 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn" event={"ID":"fad3af9b-342c-4ae5-b607-5efaaf0a9a05","Type":"ContainerDied","Data":"91b12dade25563721956f9490fc13364bdbb5eef366690199a6ff453728ce7b1"} Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.613241 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b12dade25563721956f9490fc13364bdbb5eef366690199a6ff453728ce7b1" Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.616019 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerStarted","Data":"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57"} Sep 29 18:56:00 crc kubenswrapper[4780]: I0929 18:56:00.642300 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s92cg" podStartSLOduration=2.1599935869999998 podStartE2EDuration="4.642278169s" podCreationTimestamp="2025-09-29 18:55:56 +0000 UTC" firstStartedPulling="2025-09-29 18:55:57.578629713 +0000 UTC m=+757.526927757" lastFinishedPulling="2025-09-29 18:56:00.060914295 +0000 UTC m=+760.009212339" observedRunningTime="2025-09-29 18:56:00.640293861 +0000 UTC m=+760.588591905" watchObservedRunningTime="2025-09-29 18:56:00.642278169 +0000 UTC m=+760.590576213" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.896676 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:01 crc kubenswrapper[4780]: E0929 18:56:01.896993 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="extract" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.897011 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="extract" Sep 29 18:56:01 crc kubenswrapper[4780]: E0929 18:56:01.897033 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="pull" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.897064 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="pull" Sep 29 18:56:01 crc 
kubenswrapper[4780]: E0929 18:56:01.897079 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="util" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.897091 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="util" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.897229 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad3af9b-342c-4ae5-b607-5efaaf0a9a05" containerName="extract" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.898300 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.923880 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.928835 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zlg\" (UniqueName: \"kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.928910 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:01 crc kubenswrapper[4780]: I0929 18:56:01.929111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.030863 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zlg\" (UniqueName: \"kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.031042 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.031170 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.031770 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.031992 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.055404 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zlg\" (UniqueName: \"kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg\") pod \"certified-operators-x66qs\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.215249 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:02 crc kubenswrapper[4780]: I0929 18:56:02.725627 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:02 crc kubenswrapper[4780]: W0929 18:56:02.732195 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddae531ce_d9c1_460b_916e_a0c0e1109ac0.slice/crio-953e8ba240b9f6bd6c9dfb4e72ec688822e32365f58d60a1dc3e74093c0f9800 WatchSource:0}: Error finding container 953e8ba240b9f6bd6c9dfb4e72ec688822e32365f58d60a1dc3e74093c0f9800: Status 404 returned error can't find the container with id 953e8ba240b9f6bd6c9dfb4e72ec688822e32365f58d60a1dc3e74093c0f9800 Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.223805 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.224491 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.224578 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.225714 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.225842 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" 
containerName="machine-config-daemon" containerID="cri-o://f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d" gracePeriod=600 Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.638753 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d" exitCode=0 Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.638829 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d"} Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.638869 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20"} Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.638887 4780 scope.go:117] "RemoveContainer" containerID="69a2b52b89db4fdab7624cb8dbf5c5bc56e09914aff584a5c943513fc85a4122" Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.642237 4780 generic.go:334] "Generic (PLEG): container finished" podID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerID="5bda73d612e7e6522482db5c42c1548fa48062c5983802fa424b96015e975ce7" exitCode=0 Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.642271 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x66qs" event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerDied","Data":"5bda73d612e7e6522482db5c42c1548fa48062c5983802fa424b96015e975ce7"} Sep 29 18:56:03 crc kubenswrapper[4780]: I0929 18:56:03.642292 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x66qs" event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerStarted","Data":"953e8ba240b9f6bd6c9dfb4e72ec688822e32365f58d60a1dc3e74093c0f9800"} Sep 29 18:56:05 crc kubenswrapper[4780]: I0929 18:56:05.668102 4780 generic.go:334] "Generic (PLEG): container finished" podID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerID="0940489896b51859c48441dd25b69713bd60200dd12984ca3c591c1e3b694cc7" exitCode=0 Sep 29 18:56:05 crc kubenswrapper[4780]: I0929 18:56:05.668345 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x66qs" event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerDied","Data":"0940489896b51859c48441dd25b69713bd60200dd12984ca3c591c1e3b694cc7"} Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.613598 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.613726 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.664302 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.677843 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x66qs" 
event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerStarted","Data":"e2f12a93c3f805cde616bd1b87596b14e630071e693168d9cd6698017c859f93"} Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.703644 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x66qs" podStartSLOduration=3.249962265 podStartE2EDuration="5.703621466s" podCreationTimestamp="2025-09-29 18:56:01 +0000 UTC" firstStartedPulling="2025-09-29 18:56:03.645999977 +0000 UTC m=+763.594298021" lastFinishedPulling="2025-09-29 18:56:06.099659168 +0000 UTC m=+766.047957222" observedRunningTime="2025-09-29 18:56:06.701405711 +0000 UTC m=+766.649703775" watchObservedRunningTime="2025-09-29 18:56:06.703621466 +0000 UTC m=+766.651919510" Sep 29 18:56:06 crc kubenswrapper[4780]: I0929 18:56:06.726849 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.085563 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.086749 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s92cg" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="registry-server" containerID="cri-o://e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57" gracePeriod=2 Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.595782 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.711516 4780 generic.go:334] "Generic (PLEG): container finished" podID="944ee6f8-1f23-49ce-877c-c2093b160862" containerID="e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57" exitCode=0 Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.711601 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerDied","Data":"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57"} Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.711643 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s92cg" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.711673 4780 scope.go:117] "RemoveContainer" containerID="e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.711653 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s92cg" event={"ID":"944ee6f8-1f23-49ce-877c-c2093b160862","Type":"ContainerDied","Data":"a7ab2f60f3b2d284accb3e9e12c29e5b3927eb5d0a9a3841e12bb0136e350b9a"} Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.744618 4780 scope.go:117] "RemoveContainer" containerID="3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.771810 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content\") pod \"944ee6f8-1f23-49ce-877c-c2093b160862\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.771899 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwv59\" (UniqueName: \"kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59\") pod \"944ee6f8-1f23-49ce-877c-c2093b160862\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.771956 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities\") pod \"944ee6f8-1f23-49ce-877c-c2093b160862\" (UID: \"944ee6f8-1f23-49ce-877c-c2093b160862\") " Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.780806 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities" (OuterVolumeSpecName: "utilities") pod "944ee6f8-1f23-49ce-877c-c2093b160862" (UID: "944ee6f8-1f23-49ce-877c-c2093b160862"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.790757 4780 scope.go:117] "RemoveContainer" containerID="bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.798461 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59" (OuterVolumeSpecName: "kube-api-access-nwv59") pod "944ee6f8-1f23-49ce-877c-c2093b160862" (UID: "944ee6f8-1f23-49ce-877c-c2093b160862"). InnerVolumeSpecName "kube-api-access-nwv59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.825411 4780 scope.go:117] "RemoveContainer" containerID="e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57" Sep 29 18:56:10 crc kubenswrapper[4780]: E0929 18:56:10.825868 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57\": container with ID starting with e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57 not found: ID does not exist" containerID="e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.825907 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57"} err="failed to get container status \"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57\": rpc error: code = NotFound desc = could not find container \"e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57\": container with ID starting with e4826f166e1c1aead6e2a2380020a6acace578997d8613bfc81df5242466aa57 not found: ID does not exist" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.825934 4780 scope.go:117] "RemoveContainer" containerID="3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae" Sep 29 18:56:10 crc kubenswrapper[4780]: E0929 18:56:10.826305 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae\": container with ID starting with 3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae not found: ID does not exist" containerID="3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.826334 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae"} err="failed to get container status \"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae\": rpc error: code = NotFound desc = could not find container \"3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae\": container with ID starting with 3196fdddac0ecfe3c0e8ea70b11e4fe0b856a79ff12a83332639e8c587237cae not found: ID does not exist" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.826350 4780 scope.go:117] "RemoveContainer" containerID="bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640" Sep 29 18:56:10 crc kubenswrapper[4780]: E0929 18:56:10.826596 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640\": container with ID starting with bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640 not found: ID does not exist" containerID="bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.826620 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640"} err="failed to get container status \"bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640\": rpc error: code = NotFound desc = could not 
find container \"bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640\": container with ID starting with bb4ad3e7f8684ed7bb81b7b31bbad4928cb4368fa930f3e638e80837d7465640 not found: ID does not exist" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.874920 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "944ee6f8-1f23-49ce-877c-c2093b160862" (UID: "944ee6f8-1f23-49ce-877c-c2093b160862"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.876181 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.876212 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwv59\" (UniqueName: \"kubernetes.io/projected/944ee6f8-1f23-49ce-877c-c2093b160862-kube-api-access-nwv59\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:10 crc kubenswrapper[4780]: I0929 18:56:10.876224 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944ee6f8-1f23-49ce-877c-c2093b160862-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.037541 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.044078 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s92cg"] Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.080420 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp"] Sep 29 18:56:11 crc kubenswrapper[4780]: E0929 18:56:11.080765 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="extract-utilities" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.080791 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="extract-utilities" Sep 29 18:56:11 crc kubenswrapper[4780]: E0929 18:56:11.080810 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="extract-content" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.080820 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="extract-content" Sep 29 18:56:11 crc kubenswrapper[4780]: E0929 18:56:11.080833 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="registry-server" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.080842 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="registry-server" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.081011 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" containerName="registry-server" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.081566 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.086116 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.086469 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.087428 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kznz8" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.087785 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.088261 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.149364 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp"] Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.179451 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-webhook-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.179521 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-apiservice-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.179596 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vh8k\" (UniqueName: \"kubernetes.io/projected/ce41758d-ee19-4ea0-a3b2-986d2f51d538-kube-api-access-7vh8k\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.281337 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vh8k\" (UniqueName: \"kubernetes.io/projected/ce41758d-ee19-4ea0-a3b2-986d2f51d538-kube-api-access-7vh8k\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.281427 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-webhook-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.281482 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-apiservice-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.287169 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-webhook-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.303074 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce41758d-ee19-4ea0-a3b2-986d2f51d538-apiservice-cert\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.305062 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vh8k\" (UniqueName: \"kubernetes.io/projected/ce41758d-ee19-4ea0-a3b2-986d2f51d538-kube-api-access-7vh8k\") pod \"metallb-operator-controller-manager-56b5b4fb49-hrxkp\" (UID: \"ce41758d-ee19-4ea0-a3b2-986d2f51d538\") " pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.361112 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9"] Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.362187 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.366570 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-htxbm" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.366582 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.366692 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.382240 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9"] Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.383062 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcx25\" (UniqueName: \"kubernetes.io/projected/94b2302d-6d22-4d24-8186-5707782a3cb6-kube-api-access-lcx25\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.383145 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-apiservice-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.383201 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-webhook-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.395667 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.484748 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcx25\" (UniqueName: \"kubernetes.io/projected/94b2302d-6d22-4d24-8186-5707782a3cb6-kube-api-access-lcx25\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.485510 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-apiservice-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.485559 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-webhook-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.495089 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-webhook-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.497844 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94b2302d-6d22-4d24-8186-5707782a3cb6-apiservice-cert\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.521118 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcx25\" (UniqueName: \"kubernetes.io/projected/94b2302d-6d22-4d24-8186-5707782a3cb6-kube-api-access-lcx25\") pod \"metallb-operator-webhook-server-b5dc64569-rr9r9\" (UID: \"94b2302d-6d22-4d24-8186-5707782a3cb6\") " pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.679659 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:11 crc kubenswrapper[4780]: I0929 18:56:11.964037 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp"] Sep 29 18:56:11 crc kubenswrapper[4780]: W0929 18:56:11.975031 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce41758d_ee19_4ea0_a3b2_986d2f51d538.slice/crio-0fe0f1068a1a77cf85793eda54400bded04cbb5aabfc15e1edfeb6c5f9182bc5 WatchSource:0}: Error finding container 0fe0f1068a1a77cf85793eda54400bded04cbb5aabfc15e1edfeb6c5f9182bc5: Status 404 returned error can't find the container with id 0fe0f1068a1a77cf85793eda54400bded04cbb5aabfc15e1edfeb6c5f9182bc5 Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.142478 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9"] Sep 29 18:56:12 crc kubenswrapper[4780]: W0929 18:56:12.147981 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b2302d_6d22_4d24_8186_5707782a3cb6.slice/crio-babe8ce9dff24503eac1b19dde0014acdb45b4a20e4670e1417b476a17f4534e WatchSource:0}: Error finding container babe8ce9dff24503eac1b19dde0014acdb45b4a20e4670e1417b476a17f4534e: Status 404 returned error can't find the container with id babe8ce9dff24503eac1b19dde0014acdb45b4a20e4670e1417b476a17f4534e Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.215689 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.215817 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.268391 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.730548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" event={"ID":"94b2302d-6d22-4d24-8186-5707782a3cb6","Type":"ContainerStarted","Data":"babe8ce9dff24503eac1b19dde0014acdb45b4a20e4670e1417b476a17f4534e"} Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.731632 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" event={"ID":"ce41758d-ee19-4ea0-a3b2-986d2f51d538","Type":"ContainerStarted","Data":"0fe0f1068a1a77cf85793eda54400bded04cbb5aabfc15e1edfeb6c5f9182bc5"} Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.760266 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944ee6f8-1f23-49ce-877c-c2093b160862" path="/var/lib/kubelet/pods/944ee6f8-1f23-49ce-877c-c2093b160862/volumes" Sep 29 18:56:12 crc kubenswrapper[4780]: I0929 18:56:12.774322 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.093731 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.096754 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.114777 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.176626 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.176680 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6cvx\" (UniqueName: \"kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.176712 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.277798 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.277867 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6cvx\" (UniqueName: \"kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.277910 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.279139 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.279315 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.302686 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j6cvx\" (UniqueName: \"kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx\") pod \"redhat-marketplace-xggvw\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.459294 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.895892 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:16 crc kubenswrapper[4780]: I0929 18:56:16.896310 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x66qs" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="registry-server" containerID="cri-o://e2f12a93c3f805cde616bd1b87596b14e630071e693168d9cd6698017c859f93" gracePeriod=2 Sep 29 18:56:17 crc kubenswrapper[4780]: I0929 18:56:17.818333 4780 generic.go:334] "Generic (PLEG): container finished" podID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerID="e2f12a93c3f805cde616bd1b87596b14e630071e693168d9cd6698017c859f93" exitCode=0 Sep 29 18:56:17 crc kubenswrapper[4780]: I0929 18:56:17.818905 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x66qs" event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerDied","Data":"e2f12a93c3f805cde616bd1b87596b14e630071e693168d9cd6698017c859f93"} Sep 29 18:56:17 crc kubenswrapper[4780]: I0929 18:56:17.913776 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.008487 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities\") pod \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.008618 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content\") pod \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.008663 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5zlg\" (UniqueName: \"kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg\") pod \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\" (UID: \"dae531ce-d9c1-460b-916e-a0c0e1109ac0\") " Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.009927 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities" (OuterVolumeSpecName: "utilities") pod "dae531ce-d9c1-460b-916e-a0c0e1109ac0" (UID: "dae531ce-d9c1-460b-916e-a0c0e1109ac0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.017173 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg" (OuterVolumeSpecName: "kube-api-access-m5zlg") pod "dae531ce-d9c1-460b-916e-a0c0e1109ac0" (UID: "dae531ce-d9c1-460b-916e-a0c0e1109ac0"). InnerVolumeSpecName "kube-api-access-m5zlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.062711 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dae531ce-d9c1-460b-916e-a0c0e1109ac0" (UID: "dae531ce-d9c1-460b-916e-a0c0e1109ac0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.083583 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.110186 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.110223 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae531ce-d9c1-460b-916e-a0c0e1109ac0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.110241 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5zlg\" (UniqueName: \"kubernetes.io/projected/dae531ce-d9c1-460b-916e-a0c0e1109ac0-kube-api-access-m5zlg\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.826701 4780 generic.go:334] "Generic (PLEG): container finished" podID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerID="aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99" exitCode=0 Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.826815 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerDied","Data":"aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99"} Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.826867 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerStarted","Data":"3e59f19990b69408e724794983b2a9cdfbc3f485a55712fb07aab50c87676a45"} Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.829105 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" event={"ID":"94b2302d-6d22-4d24-8186-5707782a3cb6","Type":"ContainerStarted","Data":"ed346019fe41bf1c4cc2b065bce023e3d82e220a41e8efe28e22b7ce47765f92"} Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.829176 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.832902 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-x66qs" event={"ID":"dae531ce-d9c1-460b-916e-a0c0e1109ac0","Type":"ContainerDied","Data":"953e8ba240b9f6bd6c9dfb4e72ec688822e32365f58d60a1dc3e74093c0f9800"} Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.832988 4780 scope.go:117] "RemoveContainer" containerID="e2f12a93c3f805cde616bd1b87596b14e630071e693168d9cd6698017c859f93" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.832934 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x66qs" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.835919 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" event={"ID":"ce41758d-ee19-4ea0-a3b2-986d2f51d538","Type":"ContainerStarted","Data":"7c271be38841969e442fba6dff9e98444348f8cc27b800eac88ff68742249f2c"} Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.836154 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.865350 4780 scope.go:117] "RemoveContainer" containerID="0940489896b51859c48441dd25b69713bd60200dd12984ca3c591c1e3b694cc7" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.866297 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.873792 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x66qs"] Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.888918 4780 scope.go:117] "RemoveContainer" containerID="5bda73d612e7e6522482db5c42c1548fa48062c5983802fa424b96015e975ce7" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.892668 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" podStartSLOduration=2.50994935 podStartE2EDuration="7.892644177s" podCreationTimestamp="2025-09-29 18:56:11 +0000 UTC" firstStartedPulling="2025-09-29 18:56:12.152233997 +0000 UTC m=+772.100532041" lastFinishedPulling="2025-09-29 18:56:17.534928824 +0000 UTC m=+777.483226868" observedRunningTime="2025-09-29 18:56:18.890858677 +0000 UTC m=+778.839156731" watchObservedRunningTime="2025-09-29 18:56:18.892644177 +0000 UTC m=+778.840942221" Sep 29 18:56:18 crc kubenswrapper[4780]: I0929 18:56:18.922226 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" podStartSLOduration=2.3712329580000002 podStartE2EDuration="7.922197344s" podCreationTimestamp="2025-09-29 18:56:11 +0000 UTC" firstStartedPulling="2025-09-29 18:56:11.977471697 +0000 UTC m=+771.925769741" lastFinishedPulling="2025-09-29 18:56:17.528436083 +0000 UTC m=+777.476734127" observedRunningTime="2025-09-29 18:56:18.915061717 +0000 UTC m=+778.863359761" watchObservedRunningTime="2025-09-29 18:56:18.922197344 +0000 UTC m=+778.870495388" Sep 29 18:56:19 crc kubenswrapper[4780]: I0929 18:56:19.844883 4780 generic.go:334] "Generic (PLEG): container finished" podID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerID="0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b" exitCode=0 Sep 29 18:56:19 crc kubenswrapper[4780]: I0929 18:56:19.845072 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerDied","Data":"0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b"} Sep 29 18:56:20 crc kubenswrapper[4780]: I0929 18:56:20.769659 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" path="/var/lib/kubelet/pods/dae531ce-d9c1-460b-916e-a0c0e1109ac0/volumes" Sep 29 18:56:20 crc kubenswrapper[4780]: I0929 18:56:20.855397 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerStarted","Data":"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c"} Sep 29 18:56:20 crc kubenswrapper[4780]: I0929 18:56:20.885356 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xggvw" podStartSLOduration=3.448440153 podStartE2EDuration="4.885336167s" podCreationTimestamp="2025-09-29 18:56:16 +0000 UTC" firstStartedPulling="2025-09-29 18:56:18.82808017 +0000 UTC m=+778.776378214" lastFinishedPulling="2025-09-29 18:56:20.264976184 +0000 UTC m=+780.213274228" observedRunningTime="2025-09-29 18:56:20.883032193 +0000 UTC m=+780.831330237" watchObservedRunningTime="2025-09-29 18:56:20.885336167 +0000 UTC m=+780.833634211" Sep 29 18:56:26 crc kubenswrapper[4780]: I0929 18:56:26.460003 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:26 crc kubenswrapper[4780]: I0929 18:56:26.460603 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:26 crc kubenswrapper[4780]: I0929 18:56:26.534325 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:26 crc kubenswrapper[4780]: I0929 18:56:26.943262 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:28 crc kubenswrapper[4780]: I0929 18:56:28.284216 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:28 crc kubenswrapper[4780]: I0929 18:56:28.901397 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xggvw" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="registry-server" containerID="cri-o://7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c" gracePeriod=2 Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.314280 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.496535 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content\") pod \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.496598 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6cvx\" (UniqueName: \"kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx\") pod \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.496841 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities\") pod \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\" (UID: \"9a87212a-6c9f-47dc-90ee-5a8961c2c686\") " Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.498286 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities" (OuterVolumeSpecName: "utilities") pod "9a87212a-6c9f-47dc-90ee-5a8961c2c686" (UID: "9a87212a-6c9f-47dc-90ee-5a8961c2c686"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.504505 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx" (OuterVolumeSpecName: "kube-api-access-j6cvx") pod "9a87212a-6c9f-47dc-90ee-5a8961c2c686" (UID: "9a87212a-6c9f-47dc-90ee-5a8961c2c686"). InnerVolumeSpecName "kube-api-access-j6cvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.511397 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a87212a-6c9f-47dc-90ee-5a8961c2c686" (UID: "9a87212a-6c9f-47dc-90ee-5a8961c2c686"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.599226 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.599274 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a87212a-6c9f-47dc-90ee-5a8961c2c686-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.599292 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6cvx\" (UniqueName: \"kubernetes.io/projected/9a87212a-6c9f-47dc-90ee-5a8961c2c686-kube-api-access-j6cvx\") on node \"crc\" DevicePath \"\"" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.909890 4780 generic.go:334] "Generic (PLEG): container finished" podID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerID="7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c" exitCode=0 Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.909953 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerDied","Data":"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c"} Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.909975 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xggvw" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.909997 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xggvw" event={"ID":"9a87212a-6c9f-47dc-90ee-5a8961c2c686","Type":"ContainerDied","Data":"3e59f19990b69408e724794983b2a9cdfbc3f485a55712fb07aab50c87676a45"} Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.910030 4780 scope.go:117] "RemoveContainer" containerID="7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.940286 4780 scope.go:117] "RemoveContainer" containerID="0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b" Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.967828 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.978963 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xggvw"] Sep 29 18:56:29 crc kubenswrapper[4780]: I0929 18:56:29.986686 4780 scope.go:117] "RemoveContainer" containerID="aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.017518 4780 scope.go:117] "RemoveContainer" containerID="7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c" Sep 29 18:56:30 crc kubenswrapper[4780]: E0929 18:56:30.024277 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c\": container with ID starting with 7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c not found: ID does not exist" containerID="7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.024320 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c"} err="failed to get container status \"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c\": rpc error: code = NotFound desc = could not find container \"7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c\": container with ID starting with 7bc627db488a811373c2161f6e62ba636e54776ccc51afaabcb665261d8e799c not found: ID does not exist" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.024355 4780 scope.go:117] "RemoveContainer" containerID="0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b" Sep 29 18:56:30 crc kubenswrapper[4780]: E0929 18:56:30.027197 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b\": container with ID starting with 0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b not found: ID does not exist" containerID="0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.027260 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b"} err="failed to get container status \"0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b\": rpc error: code = NotFound desc = could not find container \"0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b\": container with ID starting with 0e9cab53fac06658cb7f953751d7a4fd3b291ab786218d7ba88cd6173425bf7b not found: ID does not exist" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.027297 4780 scope.go:117] "RemoveContainer" containerID="aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99" Sep 29 18:56:30 crc kubenswrapper[4780]: E0929 18:56:30.031193 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99\": container with ID starting with aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99 not found: ID does not exist" containerID="aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.031244 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99"} err="failed to get container status \"aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99\": rpc error: code = NotFound desc = could not find container \"aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99\": container with ID starting with aca5943be0fccd737ccc9863331d89f855e95f2926f84f9ba68c6678c5fe4d99 not found: ID does not exist" Sep 29 18:56:30 crc kubenswrapper[4780]: I0929 18:56:30.759826 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" path="/var/lib/kubelet/pods/9a87212a-6c9f-47dc-90ee-5a8961c2c686/volumes" Sep 29 18:56:31 crc kubenswrapper[4780]: I0929 18:56:31.686237 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b5dc64569-rr9r9" Sep 29 18:56:51 crc kubenswrapper[4780]: I0929 18:56:51.400513 4780 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-56b5b4fb49-hrxkp" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.112724 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-flc8q"] Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.112991 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113006 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.113021 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113027 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.113037 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="extract-utilities" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113060 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="extract-utilities" Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.113072 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="extract-content" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113078 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="extract-content" Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.113088 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="extract-content" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113093 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="extract-content" Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.113103 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="extract-utilities" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113109 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="extract-utilities" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113234 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a87212a-6c9f-47dc-90ee-5a8961c2c686" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.113246 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae531ce-d9c1-460b-916e-a0c0e1109ac0" containerName="registry-server" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.115307 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-flc8q" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.118889 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.118946 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.118962 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-m4ltl" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.123829 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"] Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.124829 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.128140 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.139722 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"] Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.215938 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4w6qq"] Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.216876 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4w6qq" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.218729 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.219553 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.219960 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.220216 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w7hnp" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.223924 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-669g9\" (UniqueName: \"kubernetes.io/projected/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-kube-api-access-669g9\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.223964 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-conf\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.223989 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-reloader\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q" Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 
18:56:52.224005 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-sockets\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.224024 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.224075 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-metrics\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.224094 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ssrt\" (UniqueName: \"kubernetes.io/projected/390f974b-7808-487b-95b8-b72df5367294-kube-api-access-2ssrt\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.224126 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.224159 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/390f974b-7808-487b-95b8-b72df5367294-frr-startup\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.257588 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-zchw7"]
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.260549 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.264861 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.285307 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-zchw7"]
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.325687 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.325805 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-reloader\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.325890 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-sockets\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.325918 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.325970 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvhw\" (UniqueName: \"kubernetes.io/projected/758227bc-a5d7-4889-a3fa-0b97b5212c6f-kube-api-access-lgvhw\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326008 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-metrics\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326054 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ssrt\" (UniqueName: \"kubernetes.io/projected/390f974b-7808-487b-95b8-b72df5367294-kube-api-access-2ssrt\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326074 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metallb-excludel2\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326134 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326159 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326205 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/390f974b-7808-487b-95b8-b72df5367294-frr-startup\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326236 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-669g9\" (UniqueName: \"kubernetes.io/projected/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-kube-api-access-669g9\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.326286 4780 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.326260 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-conf\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.326401 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs podName:390f974b-7808-487b-95b8-b72df5367294 nodeName:}" failed. No retries permitted until 2025-09-29 18:56:52.826370834 +0000 UTC m=+812.774669068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs") pod "frr-k8s-flc8q" (UID: "390f974b-7808-487b-95b8-b72df5367294") : secret "frr-k8s-certs-secret" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.326754 4780 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.326795 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert podName:5ab4aafd-d570-4892-ac9c-75a4cb18b1af nodeName:}" failed. No retries permitted until 2025-09-29 18:56:52.826784915 +0000 UTC m=+812.775083189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert") pod "frr-k8s-webhook-server-5478bdb765-zwqlr" (UID: "5ab4aafd-d570-4892-ac9c-75a4cb18b1af") : secret "frr-k8s-webhook-server-cert" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.327127 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-metrics\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.327144 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-sockets\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.327319 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-reloader\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.327610 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/390f974b-7808-487b-95b8-b72df5367294-frr-conf\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.328386 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/390f974b-7808-487b-95b8-b72df5367294-frr-startup\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.352205 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-669g9\" (UniqueName: \"kubernetes.io/projected/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-kube-api-access-669g9\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.367683 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ssrt\" (UniqueName: \"kubernetes.io/projected/390f974b-7808-487b-95b8-b72df5367294-kube-api-access-2ssrt\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.427773 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.427881 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.427922 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6gh\" (UniqueName: \"kubernetes.io/projected/dd6a4e23-fac8-4a60-8aed-541962416e4a-kube-api-access-rg6gh\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.427980 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvhw\" (UniqueName: \"kubernetes.io/projected/758227bc-a5d7-4889-a3fa-0b97b5212c6f-kube-api-access-lgvhw\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.428001 4780 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.428115 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist podName:758227bc-a5d7-4889-a3fa-0b97b5212c6f nodeName:}" failed. No retries permitted until 2025-09-29 18:56:52.928091118 +0000 UTC m=+812.876389162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist") pod "speaker-4w6qq" (UID: "758227bc-a5d7-4889-a3fa-0b97b5212c6f") : secret "metallb-memberlist" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.428012 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-metrics-certs\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.428190 4780 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.428277 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-cert\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.428289 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs podName:758227bc-a5d7-4889-a3fa-0b97b5212c6f nodeName:}" failed. No retries permitted until 2025-09-29 18:56:52.928260002 +0000 UTC m=+812.876558046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs") pod "speaker-4w6qq" (UID: "758227bc-a5d7-4889-a3fa-0b97b5212c6f") : secret "speaker-certs-secret" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.428392 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metallb-excludel2\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.429187 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metallb-excludel2\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.445719 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvhw\" (UniqueName: \"kubernetes.io/projected/758227bc-a5d7-4889-a3fa-0b97b5212c6f-kube-api-access-lgvhw\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.529797 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg6gh\" (UniqueName: \"kubernetes.io/projected/dd6a4e23-fac8-4a60-8aed-541962416e4a-kube-api-access-rg6gh\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.529966 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-metrics-certs\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.530005 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-cert\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.532604 4780 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.533851 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-metrics-certs\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.545879 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd6a4e23-fac8-4a60-8aed-541962416e4a-cert\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.555730 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg6gh\" (UniqueName: \"kubernetes.io/projected/dd6a4e23-fac8-4a60-8aed-541962416e4a-kube-api-access-rg6gh\") pod \"controller-5d688f5ffc-zchw7\" (UID: \"dd6a4e23-fac8-4a60-8aed-541962416e4a\") " pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.591740 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.834189 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.834790 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.839227 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/390f974b-7808-487b-95b8-b72df5367294-metrics-certs\") pod \"frr-k8s-flc8q\" (UID: \"390f974b-7808-487b-95b8-b72df5367294\") " pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.839549 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab4aafd-d570-4892-ac9c-75a4cb18b1af-cert\") pod \"frr-k8s-webhook-server-5478bdb765-zwqlr\" (UID: \"5ab4aafd-d570-4892-ac9c-75a4cb18b1af\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.936727 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.936821 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.936914 4780 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: E0929 18:56:52.937016 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist podName:758227bc-a5d7-4889-a3fa-0b97b5212c6f nodeName:}" failed. No retries permitted until 2025-09-29 18:56:53.936988927 +0000 UTC m=+813.885287041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist") pod "speaker-4w6qq" (UID: "758227bc-a5d7-4889-a3fa-0b97b5212c6f") : secret "metallb-memberlist" not found
Sep 29 18:56:52 crc kubenswrapper[4780]: I0929 18:56:52.940934 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-metrics-certs\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.003076 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-zchw7"]
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.036913 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.044442 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.081984 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zchw7" event={"ID":"dd6a4e23-fac8-4a60-8aed-541962416e4a","Type":"ContainerStarted","Data":"c2c2b9318cd77284292082ac584e6d598983c5220b562dea2f556a6076419fa0"}
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.494501 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"]
Sep 29 18:56:53 crc kubenswrapper[4780]: W0929 18:56:53.502701 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab4aafd_d570_4892_ac9c_75a4cb18b1af.slice/crio-5d8d5a594dea98cc977f04f0b8e8527786b14ef3f60fbf3c987ee95e2e29a164 WatchSource:0}: Error finding container 5d8d5a594dea98cc977f04f0b8e8527786b14ef3f60fbf3c987ee95e2e29a164: Status 404 returned error can't find the container with id 5d8d5a594dea98cc977f04f0b8e8527786b14ef3f60fbf3c987ee95e2e29a164
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.952263 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:53 crc kubenswrapper[4780]: I0929 18:56:53.959648 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/758227bc-a5d7-4889-a3fa-0b97b5212c6f-memberlist\") pod \"speaker-4w6qq\" (UID: \"758227bc-a5d7-4889-a3fa-0b97b5212c6f\") " pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.030116 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4w6qq"
Sep 29 18:56:54 crc kubenswrapper[4780]: W0929 18:56:54.054026 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758227bc_a5d7_4889_a3fa_0b97b5212c6f.slice/crio-45c9a6fe9b28ca019fa0f12493f99ee34d48f3d394cc4818ce9776bd3eefc15a WatchSource:0}: Error finding container 45c9a6fe9b28ca019fa0f12493f99ee34d48f3d394cc4818ce9776bd3eefc15a: Status 404 returned error can't find the container with id 45c9a6fe9b28ca019fa0f12493f99ee34d48f3d394cc4818ce9776bd3eefc15a
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.100489 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr" event={"ID":"5ab4aafd-d570-4892-ac9c-75a4cb18b1af","Type":"ContainerStarted","Data":"5d8d5a594dea98cc977f04f0b8e8527786b14ef3f60fbf3c987ee95e2e29a164"}
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.111380 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zchw7" event={"ID":"dd6a4e23-fac8-4a60-8aed-541962416e4a","Type":"ContainerStarted","Data":"1b2f06d29a2edd4d8a70f3eab0dd514fe9ae50fcfe0a9b4b9680b9a1c54ded38"}
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.111437 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zchw7" event={"ID":"dd6a4e23-fac8-4a60-8aed-541962416e4a","Type":"ContainerStarted","Data":"c461a41c38db73a190682517d9db711e6a0d8009946f85186c22eabb512e3b61"}
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.111510 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.118038 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4w6qq" event={"ID":"758227bc-a5d7-4889-a3fa-0b97b5212c6f","Type":"ContainerStarted","Data":"45c9a6fe9b28ca019fa0f12493f99ee34d48f3d394cc4818ce9776bd3eefc15a"}
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.129109 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"3f98f2b76922afa4bb18e7db0ddbe500468f93f6ba83f68eead1ee9da8b9dc2e"}
Sep 29 18:56:54 crc kubenswrapper[4780]: I0929 18:56:54.149402 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-zchw7" podStartSLOduration=2.149384769 podStartE2EDuration="2.149384769s" podCreationTimestamp="2025-09-29 18:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:56:54.145593194 +0000 UTC m=+814.093891238" watchObservedRunningTime="2025-09-29 18:56:54.149384769 +0000 UTC m=+814.097682813"
Sep 29 18:56:55 crc kubenswrapper[4780]: I0929 18:56:55.144985 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4w6qq" event={"ID":"758227bc-a5d7-4889-a3fa-0b97b5212c6f","Type":"ContainerStarted","Data":"7df3e41a093ebd888ebd8e5686b832ab267bb3dacfcf1d43dd0b6339181e7b23"}
Sep 29 18:56:55 crc kubenswrapper[4780]: I0929 18:56:55.145437 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4w6qq" event={"ID":"758227bc-a5d7-4889-a3fa-0b97b5212c6f","Type":"ContainerStarted","Data":"fbb601f8ee7d8ba26c9bf03d599e49fcf29654b4126f026c8cc79e42aab3ba35"}
Sep 29 18:56:55 crc kubenswrapper[4780]: I0929 18:56:55.167767 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4w6qq" podStartSLOduration=3.167743073 podStartE2EDuration="3.167743073s" podCreationTimestamp="2025-09-29 18:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:56:55.162003834 +0000 UTC m=+815.110301878" watchObservedRunningTime="2025-09-29 18:56:55.167743073 +0000 UTC m=+815.116041117"
Sep 29 18:56:56 crc kubenswrapper[4780]: I0929 18:56:56.151940 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4w6qq"
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.191662 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr" event={"ID":"5ab4aafd-d570-4892-ac9c-75a4cb18b1af","Type":"ContainerStarted","Data":"519d02b71743d4424c6ccd6ea9255ac7acb12fa3d0b560cf4fb1d7a1f1571058"}
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.192618 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.193991 4780 generic.go:334] "Generic (PLEG): container finished" podID="390f974b-7808-487b-95b8-b72df5367294" containerID="47d357ba58096e71520b371e9758db899f031c382773c969635f31012c7dcb87" exitCode=0
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.194120 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerDied","Data":"47d357ba58096e71520b371e9758db899f031c382773c969635f31012c7dcb87"}
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.213176 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr" podStartSLOduration=2.708886618 podStartE2EDuration="9.213141724s" podCreationTimestamp="2025-09-29 18:56:52 +0000 UTC" firstStartedPulling="2025-09-29 18:56:53.505606329 +0000 UTC m=+813.453904373" lastFinishedPulling="2025-09-29 18:57:00.009861435 +0000 UTC m=+819.958159479" observedRunningTime="2025-09-29 18:57:01.212185238 +0000 UTC m=+821.160483292" watchObservedRunningTime="2025-09-29 18:57:01.213141724 +0000 UTC m=+821.161439788"
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.934919 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.937931 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:01 crc kubenswrapper[4780]: I0929 18:57:01.948247 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.089499 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.089591 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.089635 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc49k\" (UniqueName: \"kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.191819 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.191890 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.191929 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc49k\" (UniqueName: \"kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.192566 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.192645 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.203405 4780 generic.go:334] "Generic (PLEG): container finished" podID="390f974b-7808-487b-95b8-b72df5367294" containerID="7c4bcff7f4c1d53d82c0fd67bfefb0b927c93764c9760ec8a920d01706954efe" exitCode=0
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.203487 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerDied","Data":"7c4bcff7f4c1d53d82c0fd67bfefb0b927c93764c9760ec8a920d01706954efe"}
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.236601 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc49k\" (UniqueName: \"kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k\") pod \"community-operators-2c8nm\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") " pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.267608 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:02 crc kubenswrapper[4780]: I0929 18:57:02.798836 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:02 crc kubenswrapper[4780]: W0929 18:57:02.807692 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed62f37_b35d_477a_b70c_447698bd1b22.slice/crio-9cabac07f11f77c249ba787ba74d64d1f30031f7a0797320c5562e4d61f2e4cd WatchSource:0}: Error finding container 9cabac07f11f77c249ba787ba74d64d1f30031f7a0797320c5562e4d61f2e4cd: Status 404 returned error can't find the container with id 9cabac07f11f77c249ba787ba74d64d1f30031f7a0797320c5562e4d61f2e4cd
Sep 29 18:57:03 crc kubenswrapper[4780]: I0929 18:57:03.213333 4780 generic.go:334] "Generic (PLEG): container finished" podID="390f974b-7808-487b-95b8-b72df5367294" containerID="3c6e8c61d108554a48e754fbbdf0becabdab3713a63335576064256a68550b83" exitCode=0
Sep 29 18:57:03 crc kubenswrapper[4780]: I0929 18:57:03.213451 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerDied","Data":"3c6e8c61d108554a48e754fbbdf0becabdab3713a63335576064256a68550b83"}
Sep 29 18:57:03 crc kubenswrapper[4780]: I0929 18:57:03.215537 4780 generic.go:334] "Generic (PLEG): container finished" podID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerID="67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0" exitCode=0
Sep 29 18:57:03 crc kubenswrapper[4780]: I0929 18:57:03.215620 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerDied","Data":"67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0"}
Sep 29 18:57:03 crc kubenswrapper[4780]: I0929 18:57:03.215708 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerStarted","Data":"9cabac07f11f77c249ba787ba74d64d1f30031f7a0797320c5562e4d61f2e4cd"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.040992 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4w6qq"
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.249151 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"50e5e052308fea7e9a87462617d2bc17f934d1f342743d499c3901d7971c9106"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.249278 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"19ead24ffee9d9f2d2b306313a03a4e2a2c2a497614deecf11932347f2b04841"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.249299 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"1d85e21d7f7cad8e077288c5117e41291cf13460f4477142dc7ec63cf306891a"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.249317 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"8348efe46641427c89660be082e526c541e663aba4b79f62185aba3cb485f26b"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.249333 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"6683fb897be46543b237db9640062f60319661809b40dcb511288952bbfff9d4"}
Sep 29 18:57:04 crc kubenswrapper[4780]: I0929 18:57:04.254326 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerStarted","Data":"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"}
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.269171 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-flc8q" event={"ID":"390f974b-7808-487b-95b8-b72df5367294","Type":"ContainerStarted","Data":"fdf3c966060d2501fb2c420e5005b0a63a02c53a004fbed40bab04482c97cc57"}
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.269706 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.271277 4780 generic.go:334] "Generic (PLEG): container finished" podID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerID="715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da" exitCode=0
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.271347 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerDied","Data":"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"}
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.321682 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-flc8q" podStartSLOduration=6.509344771 podStartE2EDuration="13.32166206s" podCreationTimestamp="2025-09-29 18:56:52 +0000 UTC" firstStartedPulling="2025-09-29 18:56:53.188345111 +0000 UTC m=+813.136643155" lastFinishedPulling="2025-09-29 18:57:00.00066239 +0000 UTC m=+819.948960444" observedRunningTime="2025-09-29 18:57:05.298405457 +0000 UTC m=+825.246703501" watchObservedRunningTime="2025-09-29 18:57:05.32166206 +0000 UTC m=+825.269960094"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.555998 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"]
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.557547 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.561584 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.568843 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"]
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.651805 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.651916 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.652033 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg86t\" (UniqueName: \"kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.753872 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.753958 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg86t\" (UniqueName: \"kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.754036 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.754512 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.754615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.777953 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg86t\" (UniqueName: \"kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:05 crc kubenswrapper[4780]: I0929 18:57:05.876828 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:06 crc kubenswrapper[4780]: I0929 18:57:06.280202 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerStarted","Data":"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"}
Sep 29 18:57:06 crc kubenswrapper[4780]: I0929 18:57:06.298285 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2c8nm" podStartSLOduration=2.858595013 podStartE2EDuration="5.298243448s" podCreationTimestamp="2025-09-29 18:57:01 +0000 UTC" firstStartedPulling="2025-09-29 18:57:03.217651101 +0000 UTC m=+823.165949145" lastFinishedPulling="2025-09-29 18:57:05.657299536 +0000 UTC m=+825.605597580" observedRunningTime="2025-09-29 18:57:06.297596561 +0000 UTC m=+826.245894605" watchObservedRunningTime="2025-09-29 18:57:06.298243448 +0000 UTC m=+826.246541492"
Sep 29 18:57:06 crc kubenswrapper[4780]: I0929 18:57:06.325204 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"]
Sep 29 18:57:07 crc kubenswrapper[4780]: I0929 18:57:07.290643 4780 generic.go:334] "Generic (PLEG): container finished" podID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerID="2f1128db70e0dc4445d38022dcb7ed76e6c1e1441daed5dd14ab6900e7868047" exitCode=0
Sep 29 18:57:07 crc kubenswrapper[4780]: I0929 18:57:07.290711 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp" event={"ID":"432abf3f-794c-4850-88e9-b1d509c9dd42","Type":"ContainerDied","Data":"2f1128db70e0dc4445d38022dcb7ed76e6c1e1441daed5dd14ab6900e7868047"}
Sep 29 18:57:07 crc kubenswrapper[4780]: I0929 18:57:07.291261 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp" event={"ID":"432abf3f-794c-4850-88e9-b1d509c9dd42","Type":"ContainerStarted","Data":"453196c7ee9b0dad00237f856cd5c5b93ac73ac923711fe25477ab98ca0bcb01"}
Sep 29 18:57:08 crc kubenswrapper[4780]: I0929 18:57:08.038588 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:57:08 crc kubenswrapper[4780]: I0929 18:57:08.102881 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:57:11 crc kubenswrapper[4780]: I0929 18:57:11.325570 4780 generic.go:334] "Generic (PLEG): container finished" podID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerID="1f375302bffab2e51e0068a80030e735077af3d9fee11ecaa6a8866fdbda0d83" exitCode=0
Sep 29 18:57:11 crc kubenswrapper[4780]: I0929 18:57:11.325621 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp" event={"ID":"432abf3f-794c-4850-88e9-b1d509c9dd42","Type":"ContainerDied","Data":"1f375302bffab2e51e0068a80030e735077af3d9fee11ecaa6a8866fdbda0d83"}
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.268193 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.268316 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.319650 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.350701 4780 generic.go:334] "Generic (PLEG): container finished" podID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerID="dbcbfe54de24ba7fdfb536fd9f4c48cc4334e9269f8b03d521733e4c1cb37bc8" exitCode=0
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.350776 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp" event={"ID":"432abf3f-794c-4850-88e9-b1d509c9dd42","Type":"ContainerDied","Data":"dbcbfe54de24ba7fdfb536fd9f4c48cc4334e9269f8b03d521733e4c1cb37bc8"}
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.389857 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:12 crc kubenswrapper[4780]: I0929 18:57:12.599596 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-zchw7"
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.040442 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-flc8q"
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.053004 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-zwqlr"
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.690011 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.773447 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle\") pod \"432abf3f-794c-4850-88e9-b1d509c9dd42\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") "
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.773516 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg86t\" (UniqueName: \"kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t\") pod \"432abf3f-794c-4850-88e9-b1d509c9dd42\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") "
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.773551 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util\") pod \"432abf3f-794c-4850-88e9-b1d509c9dd42\" (UID: \"432abf3f-794c-4850-88e9-b1d509c9dd42\") "
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.774690 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle" (OuterVolumeSpecName: "bundle") pod "432abf3f-794c-4850-88e9-b1d509c9dd42" (UID: "432abf3f-794c-4850-88e9-b1d509c9dd42"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.784785 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util" (OuterVolumeSpecName: "util") pod "432abf3f-794c-4850-88e9-b1d509c9dd42" (UID: "432abf3f-794c-4850-88e9-b1d509c9dd42"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.791329 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t" (OuterVolumeSpecName: "kube-api-access-kg86t") pod "432abf3f-794c-4850-88e9-b1d509c9dd42" (UID: "432abf3f-794c-4850-88e9-b1d509c9dd42"). InnerVolumeSpecName "kube-api-access-kg86t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.875914 4780 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.875961 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg86t\" (UniqueName: \"kubernetes.io/projected/432abf3f-794c-4850-88e9-b1d509c9dd42-kube-api-access-kg86t\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:13 crc kubenswrapper[4780]: I0929 18:57:13.875973 4780 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/432abf3f-794c-4850-88e9-b1d509c9dd42-util\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:14 crc kubenswrapper[4780]: I0929 18:57:14.365209 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp" event={"ID":"432abf3f-794c-4850-88e9-b1d509c9dd42","Type":"ContainerDied","Data":"453196c7ee9b0dad00237f856cd5c5b93ac73ac923711fe25477ab98ca0bcb01"}
Sep 29 18:57:14 crc kubenswrapper[4780]: I0929 18:57:14.365287 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp"
Sep 29 18:57:14 crc kubenswrapper[4780]: I0929 18:57:14.365300 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="453196c7ee9b0dad00237f856cd5c5b93ac73ac923711fe25477ab98ca0bcb01"
Sep 29 18:57:15 crc kubenswrapper[4780]: I0929 18:57:15.294275 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:15 crc kubenswrapper[4780]: I0929 18:57:15.294511 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2c8nm" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="registry-server" containerID="cri-o://efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad" gracePeriod=2
Sep 29 18:57:15 crc kubenswrapper[4780]: I0929 18:57:15.844346 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.012179 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc49k\" (UniqueName: \"kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k\") pod \"7ed62f37-b35d-477a-b70c-447698bd1b22\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") "
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.012266 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities\") pod \"7ed62f37-b35d-477a-b70c-447698bd1b22\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") "
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.012390 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content\") pod \"7ed62f37-b35d-477a-b70c-447698bd1b22\" (UID: \"7ed62f37-b35d-477a-b70c-447698bd1b22\") "
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.013488 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities" (OuterVolumeSpecName: "utilities") pod "7ed62f37-b35d-477a-b70c-447698bd1b22" (UID: "7ed62f37-b35d-477a-b70c-447698bd1b22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.038402 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k" (OuterVolumeSpecName: "kube-api-access-nc49k") pod "7ed62f37-b35d-477a-b70c-447698bd1b22" (UID: "7ed62f37-b35d-477a-b70c-447698bd1b22"). InnerVolumeSpecName "kube-api-access-nc49k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.090145 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ed62f37-b35d-477a-b70c-447698bd1b22" (UID: "7ed62f37-b35d-477a-b70c-447698bd1b22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.115125 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.115177 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc49k\" (UniqueName: \"kubernetes.io/projected/7ed62f37-b35d-477a-b70c-447698bd1b22-kube-api-access-nc49k\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.115195 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ed62f37-b35d-477a-b70c-447698bd1b22-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.380133 4780 generic.go:334] "Generic (PLEG): container finished" podID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerID="efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad" exitCode=0
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.380204 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerDied","Data":"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"}
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.380224 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c8nm"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.380254 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c8nm" event={"ID":"7ed62f37-b35d-477a-b70c-447698bd1b22","Type":"ContainerDied","Data":"9cabac07f11f77c249ba787ba74d64d1f30031f7a0797320c5562e4d61f2e4cd"}
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.380280 4780 scope.go:117] "RemoveContainer" containerID="efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.403378 4780 scope.go:117] "RemoveContainer" containerID="715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.418583 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.438280 4780 scope.go:117] "RemoveContainer" containerID="67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.440021 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2c8nm"]
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.459158 4780 scope.go:117] "RemoveContainer" containerID="efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"
Sep 29 18:57:16 crc kubenswrapper[4780]: E0929 18:57:16.459933 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad\": container with ID starting with efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad not found: ID does not exist" containerID="efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.459973 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad"} err="failed to get container status \"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad\": rpc error: code = NotFound desc = could not find container \"efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad\": container with ID starting with efd79d0382c2616be959ab7c7529dff4c0ff734a6720b19de12824d51a09f9ad not found: ID does not exist"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.460000 4780 scope.go:117] "RemoveContainer" containerID="715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"
Sep 29 18:57:16 crc kubenswrapper[4780]: E0929 18:57:16.460613 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da\": container with ID starting with 715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da not found: ID does not exist" containerID="715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.460647 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da"} err="failed to get container status \"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da\": rpc error: code = NotFound desc = could not find container \"715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da\": container with ID starting with 715f78fba18d89f54ece47683c5349605c730a7fe27c676fa9d3a2b1ace641da not found: ID does not exist"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.460664 4780 scope.go:117] "RemoveContainer" containerID="67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0"
Sep 29 18:57:16 crc kubenswrapper[4780]: E0929 18:57:16.461025 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0\": container with ID starting with 67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0 not found: ID does not exist" containerID="67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.461106 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0"} err="failed to get container status \"67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0\": rpc error: code = NotFound desc = could not find container \"67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0\": container with ID starting with 67de48153f7a12ff9f51cfa06261edc2f3c3deb5d13909de17844779ccf642f0 not found: ID does not exist"
Sep 29 18:57:16 crc kubenswrapper[4780]: I0929 18:57:16.760753 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" path="/var/lib/kubelet/pods/7ed62f37-b35d-477a-b70c-447698bd1b22/volumes"
Sep 29 18:57:17 crc kubenswrapper[4780]: I0929 18:57:17.998865 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt"]
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999684 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="registry-server"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999701 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="registry-server"
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999709 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="util"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999717 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="util"
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999735 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="pull"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999743 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="pull"
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999761 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="extract"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999768 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="extract"
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999779 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="extract-content"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999786 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="extract-content"
Sep 29 18:57:18 crc kubenswrapper[4780]: E0929 18:57:17.999797 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="extract-utilities"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999806 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="extract-utilities"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999926 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed62f37-b35d-477a-b70c-447698bd1b22" containerName="registry-server"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:17.999937 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="432abf3f-794c-4850-88e9-b1d509c9dd42" containerName="extract"
Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.000504 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.003287 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.003649 4780 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vmbqj" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.006817 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.021349 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt"] Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.150593 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncdz\" (UniqueName: \"kubernetes.io/projected/cd59a112-f27b-4c63-9ff0-258d295c5109-kube-api-access-nncdz\") pod \"cert-manager-operator-controller-manager-57cd46d6d-zcdpt\" (UID: \"cd59a112-f27b-4c63-9ff0-258d295c5109\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.252666 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nncdz\" (UniqueName: \"kubernetes.io/projected/cd59a112-f27b-4c63-9ff0-258d295c5109-kube-api-access-nncdz\") pod \"cert-manager-operator-controller-manager-57cd46d6d-zcdpt\" (UID: \"cd59a112-f27b-4c63-9ff0-258d295c5109\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.286538 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncdz\" (UniqueName: \"kubernetes.io/projected/cd59a112-f27b-4c63-9ff0-258d295c5109-kube-api-access-nncdz\") pod \"cert-manager-operator-controller-manager-57cd46d6d-zcdpt\" (UID: \"cd59a112-f27b-4c63-9ff0-258d295c5109\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.316475 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" Sep 29 18:57:18 crc kubenswrapper[4780]: I0929 18:57:18.678097 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt"] Sep 29 18:57:18 crc kubenswrapper[4780]: W0929 18:57:18.692621 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd59a112_f27b_4c63_9ff0_258d295c5109.slice/crio-5ca0bfcb148657f19a9a3f9506e27ecc64c8a144cdaff2f59c57665e3d0a6d9c WatchSource:0}: Error finding container 5ca0bfcb148657f19a9a3f9506e27ecc64c8a144cdaff2f59c57665e3d0a6d9c: Status 404 returned error can't find the container with id 5ca0bfcb148657f19a9a3f9506e27ecc64c8a144cdaff2f59c57665e3d0a6d9c Sep 29 18:57:19 crc kubenswrapper[4780]: I0929 18:57:19.408405 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" event={"ID":"cd59a112-f27b-4c63-9ff0-258d295c5109","Type":"ContainerStarted","Data":"5ca0bfcb148657f19a9a3f9506e27ecc64c8a144cdaff2f59c57665e3d0a6d9c"} Sep 29 18:57:26 crc kubenswrapper[4780]: I0929 18:57:26.460124 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" event={"ID":"cd59a112-f27b-4c63-9ff0-258d295c5109","Type":"ContainerStarted","Data":"093032744cb837d67d73faa60c9f17a1f20b015a99916040a2c17a2823801b42"} Sep 29 18:57:26 crc kubenswrapper[4780]: I0929 18:57:26.485570 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-zcdpt" podStartSLOduration=2.522831331 podStartE2EDuration="9.485548109s" podCreationTimestamp="2025-09-29 18:57:17 +0000 UTC" firstStartedPulling="2025-09-29 18:57:18.697062164 +0000 UTC m=+838.645360208" lastFinishedPulling="2025-09-29 18:57:25.659778942 +0000 UTC m=+845.608076986" observedRunningTime="2025-09-29 18:57:26.483943143 +0000 UTC m=+846.432241207" watchObservedRunningTime="2025-09-29 18:57:26.485548109 +0000 UTC m=+846.433846163" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.370262 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4hvmj"] Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.371595 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.375440 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.375883 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.376420 4780 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vdblv" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.406265 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.406354 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqdm\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-kube-api-access-5jqdm\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.439413 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4hvmj"] Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.507645 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqdm\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-kube-api-access-5jqdm\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.507748 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.532778 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.533020 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqdm\" (UniqueName: \"kubernetes.io/projected/b94c3657-f10a-48a3-a551-696d692130cb-kube-api-access-5jqdm\") pod \"cert-manager-webhook-d969966f-4hvmj\" (UID: \"b94c3657-f10a-48a3-a551-696d692130cb\") " pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:28 crc kubenswrapper[4780]: I0929 18:57:28.692183 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:29 crc kubenswrapper[4780]: I0929 18:57:29.179701 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4hvmj"] Sep 29 18:57:29 crc kubenswrapper[4780]: I0929 18:57:29.481293 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" event={"ID":"b94c3657-f10a-48a3-a551-696d692130cb","Type":"ContainerStarted","Data":"0abf9e3fe1cd7e4ba04ab17658032793aa0dd20bc8ddc858e3345e26c1c82da8"} Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.542322 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw"] Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.543812 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.547623 4780 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-s7h2j" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.567226 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw"] Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.646289 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6sqw\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-kube-api-access-h6sqw\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.647517 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.752561 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.752655 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6sqw\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-kube-api-access-h6sqw\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.782270 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.784379 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h6sqw\" (UniqueName: \"kubernetes.io/projected/6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a-kube-api-access-h6sqw\") pod \"cert-manager-cainjector-7d9f95dbf-p2mzw\" (UID: \"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:30 crc kubenswrapper[4780]: I0929 18:57:30.865913 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" Sep 29 18:57:31 crc kubenswrapper[4780]: I0929 18:57:31.310475 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw"] Sep 29 18:57:31 crc kubenswrapper[4780]: I0929 18:57:31.499301 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" event={"ID":"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a","Type":"ContainerStarted","Data":"b04395b7550ec010a0c2892cabc7196d73af6ca33698aae6335ef9fc271f9a95"} Sep 29 18:57:34 crc kubenswrapper[4780]: I0929 18:57:34.520638 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" event={"ID":"6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a","Type":"ContainerStarted","Data":"ff00948300689a04cdea50fe014a937aa35b273927752a4c4b3e0917341634af"} Sep 29 18:57:34 crc kubenswrapper[4780]: I0929 18:57:34.522937 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" event={"ID":"b94c3657-f10a-48a3-a551-696d692130cb","Type":"ContainerStarted","Data":"4debadb98930b827f72a44679ea42641c6ad62bd66f8c440b45480a8d6ba4343"} Sep 29 18:57:34 crc kubenswrapper[4780]: I0929 18:57:34.523003 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:34 crc kubenswrapper[4780]: I0929 18:57:34.539809 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-p2mzw" podStartSLOduration=2.284728301 podStartE2EDuration="4.539779202s" podCreationTimestamp="2025-09-29 18:57:30 +0000 UTC" firstStartedPulling="2025-09-29 18:57:31.324173865 +0000 UTC m=+851.272471909" lastFinishedPulling="2025-09-29 18:57:33.579224766 +0000 UTC m=+853.527522810" observedRunningTime="2025-09-29 18:57:34.538531547 +0000 UTC m=+854.486829591" watchObservedRunningTime="2025-09-29 18:57:34.539779202 +0000 UTC m=+854.488077246" Sep 29 18:57:34 crc kubenswrapper[4780]: I0929 18:57:34.573870 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" podStartSLOduration=2.180752515 podStartE2EDuration="6.573841817s" podCreationTimestamp="2025-09-29 18:57:28 +0000 UTC" firstStartedPulling="2025-09-29 18:57:29.186710681 +0000 UTC m=+849.135008725" lastFinishedPulling="2025-09-29 18:57:33.579799983 +0000 UTC m=+853.528098027" observedRunningTime="2025-09-29 18:57:34.57253159 +0000 UTC m=+854.520829624" watchObservedRunningTime="2025-09-29 18:57:34.573841817 +0000 UTC m=+854.522139871" Sep 29 18:57:38 crc kubenswrapper[4780]: I0929 18:57:38.695247 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-4hvmj" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.173575 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-lwn5c"] Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 
18:57:39.174773 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.183289 4780 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fqn4x" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.189014 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-lwn5c"] Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.288717 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.289161 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4tb\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-kube-api-access-7m4tb\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.390583 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.390716 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4tb\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-kube-api-access-7m4tb\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.412272 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4tb\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-kube-api-access-7m4tb\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.413373 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcdc02e7-5830-4c51-abac-696d1744137e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-lwn5c\" (UID: \"bcdc02e7-5830-4c51-abac-696d1744137e\") " pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.497914 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" Sep 29 18:57:39 crc kubenswrapper[4780]: I0929 18:57:39.928382 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-lwn5c"] Sep 29 18:57:40 crc kubenswrapper[4780]: I0929 18:57:40.566291 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" event={"ID":"bcdc02e7-5830-4c51-abac-696d1744137e","Type":"ContainerStarted","Data":"a3add3a44d3ad2801e3719c23a3524e9d5d450c5ba61e0dc94735b1977e8cc95"} Sep 29 18:57:40 crc kubenswrapper[4780]: I0929 18:57:40.566657 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" event={"ID":"bcdc02e7-5830-4c51-abac-696d1744137e","Type":"ContainerStarted","Data":"3f8ddd3907c54a6e0098e875e291096c6915cef07689dc6f0195cb083b9bdce8"} Sep 29 18:57:40 crc kubenswrapper[4780]: I0929 18:57:40.588393 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-lwn5c" podStartSLOduration=1.588363954 podStartE2EDuration="1.588363954s" podCreationTimestamp="2025-09-29 18:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:57:40.583546856 +0000 UTC m=+860.531844900" watchObservedRunningTime="2025-09-29 18:57:40.588363954 +0000 UTC m=+860.536662008" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.850221 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.851891 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.854678 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.859204 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-grxxc" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.859293 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.865182 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.888365 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slvsj\" (UniqueName: \"kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj\") pod \"openstack-operator-index-s5rlp\" (UID: \"e2179424-c9c3-4891-b07e-712b8c30012a\") " pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:51 crc kubenswrapper[4780]: I0929 18:57:51.989584 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slvsj\" (UniqueName: \"kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj\") pod \"openstack-operator-index-s5rlp\" (UID: \"e2179424-c9c3-4891-b07e-712b8c30012a\") " pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:52 crc kubenswrapper[4780]: I0929 18:57:52.008300 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-slvsj\" (UniqueName: \"kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj\") pod \"openstack-operator-index-s5rlp\" (UID: \"e2179424-c9c3-4891-b07e-712b8c30012a\") " pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:52 crc kubenswrapper[4780]: I0929 18:57:52.173974 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:52 crc kubenswrapper[4780]: I0929 18:57:52.611358 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:52 crc kubenswrapper[4780]: W0929 18:57:52.617230 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2179424_c9c3_4891_b07e_712b8c30012a.slice/crio-cea66832f64ebfa43806e0f89c6b56e583a505ed1929ba47e45dc93dd95da1ce WatchSource:0}: Error finding container cea66832f64ebfa43806e0f89c6b56e583a505ed1929ba47e45dc93dd95da1ce: Status 404 returned error can't find the container with id cea66832f64ebfa43806e0f89c6b56e583a505ed1929ba47e45dc93dd95da1ce Sep 29 18:57:52 crc kubenswrapper[4780]: I0929 18:57:52.660768 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5rlp" event={"ID":"e2179424-c9c3-4891-b07e-712b8c30012a","Type":"ContainerStarted","Data":"cea66832f64ebfa43806e0f89c6b56e583a505ed1929ba47e45dc93dd95da1ce"} Sep 29 18:57:53 crc kubenswrapper[4780]: I0929 18:57:53.669789 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5rlp" event={"ID":"e2179424-c9c3-4891-b07e-712b8c30012a","Type":"ContainerStarted","Data":"3223e17d3e8fa141bd875b403808e00971ff05c762a12fcc92dabbec4d57a330"} Sep 29 18:57:53 crc kubenswrapper[4780]: I0929 18:57:53.685456 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s5rlp" podStartSLOduration=1.889320572 podStartE2EDuration="2.685428441s" podCreationTimestamp="2025-09-29 18:57:51 +0000 UTC" firstStartedPulling="2025-09-29 18:57:52.62131514 +0000 UTC m=+872.569613204" lastFinishedPulling="2025-09-29 18:57:53.417423009 +0000 UTC m=+873.365721073" observedRunningTime="2025-09-29 18:57:53.684870335 +0000 UTC m=+873.633168379" watchObservedRunningTime="2025-09-29 18:57:53.685428441 +0000 UTC m=+873.633726475" Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.223104 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.683491 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-s5rlp" podUID="e2179424-c9c3-4891-b07e-712b8c30012a" containerName="registry-server" containerID="cri-o://3223e17d3e8fa141bd875b403808e00971ff05c762a12fcc92dabbec4d57a330" gracePeriod=2 Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.825655 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hrkrx"] Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.826588 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.840189 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hrkrx"] Sep 29 18:57:55 crc kubenswrapper[4780]: I0929 18:57:55.955291 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxpvc\" (UniqueName: \"kubernetes.io/projected/8cb0a136-583a-4403-924a-bdedd686f874-kube-api-access-dxpvc\") pod \"openstack-operator-index-hrkrx\" (UID: \"8cb0a136-583a-4403-924a-bdedd686f874\") " pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.056686 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxpvc\" (UniqueName: \"kubernetes.io/projected/8cb0a136-583a-4403-924a-bdedd686f874-kube-api-access-dxpvc\") pod \"openstack-operator-index-hrkrx\" (UID: \"8cb0a136-583a-4403-924a-bdedd686f874\") " pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.085151 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxpvc\" (UniqueName: \"kubernetes.io/projected/8cb0a136-583a-4403-924a-bdedd686f874-kube-api-access-dxpvc\") pod \"openstack-operator-index-hrkrx\" (UID: \"8cb0a136-583a-4403-924a-bdedd686f874\") " pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.152296 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.587214 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hrkrx"] Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.694529 4780 generic.go:334] "Generic (PLEG): container finished" podID="e2179424-c9c3-4891-b07e-712b8c30012a" containerID="3223e17d3e8fa141bd875b403808e00971ff05c762a12fcc92dabbec4d57a330" exitCode=0 Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.694598 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5rlp" event={"ID":"e2179424-c9c3-4891-b07e-712b8c30012a","Type":"ContainerDied","Data":"3223e17d3e8fa141bd875b403808e00971ff05c762a12fcc92dabbec4d57a330"} Sep 29 18:57:56 crc kubenswrapper[4780]: I0929 18:57:56.696088 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hrkrx" event={"ID":"8cb0a136-583a-4403-924a-bdedd686f874","Type":"ContainerStarted","Data":"01781a1ec77627226df91454e373668bc219b69b4c224e08f64cd3d7444285a8"} Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.321101 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.379572 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slvsj\" (UniqueName: \"kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj\") pod \"e2179424-c9c3-4891-b07e-712b8c30012a\" (UID: \"e2179424-c9c3-4891-b07e-712b8c30012a\") " Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.386164 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj" (OuterVolumeSpecName: "kube-api-access-slvsj") pod "e2179424-c9c3-4891-b07e-712b8c30012a" (UID: "e2179424-c9c3-4891-b07e-712b8c30012a"). InnerVolumeSpecName "kube-api-access-slvsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.481283 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slvsj\" (UniqueName: \"kubernetes.io/projected/e2179424-c9c3-4891-b07e-712b8c30012a-kube-api-access-slvsj\") on node \"crc\" DevicePath \"\"" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.706336 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5rlp" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.706342 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5rlp" event={"ID":"e2179424-c9c3-4891-b07e-712b8c30012a","Type":"ContainerDied","Data":"cea66832f64ebfa43806e0f89c6b56e583a505ed1929ba47e45dc93dd95da1ce"} Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.706433 4780 scope.go:117] "RemoveContainer" containerID="3223e17d3e8fa141bd875b403808e00971ff05c762a12fcc92dabbec4d57a330" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.708677 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hrkrx" event={"ID":"8cb0a136-583a-4403-924a-bdedd686f874","Type":"ContainerStarted","Data":"44fcbe4351ab5e072c5a4ee3db167806a5f56a6fc6f616e5c3a8424965a2269e"} Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.752615 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hrkrx" podStartSLOduration=2.294715078 podStartE2EDuration="2.752585514s" podCreationTimestamp="2025-09-29 18:57:55 +0000 UTC" firstStartedPulling="2025-09-29 18:57:56.598992492 +0000 UTC m=+876.547290536" lastFinishedPulling="2025-09-29 18:57:57.056862908 +0000 UTC m=+877.005160972" observedRunningTime="2025-09-29 18:57:57.729130242 +0000 UTC m=+877.677428326" watchObservedRunningTime="2025-09-29 18:57:57.752585514 +0000 UTC m=+877.700883588" Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.755245 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:57 crc kubenswrapper[4780]: I0929 18:57:57.763759 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-s5rlp"] Sep 29 18:57:58 crc kubenswrapper[4780]: I0929 18:57:58.770769 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2179424-c9c3-4891-b07e-712b8c30012a" path="/var/lib/kubelet/pods/e2179424-c9c3-4891-b07e-712b8c30012a/volumes" Sep 29 18:58:03 crc kubenswrapper[4780]: I0929 18:58:03.223678 4780 patch_prober.go:28] 
interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:58:03 crc kubenswrapper[4780]: I0929 18:58:03.224316 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:58:06 crc kubenswrapper[4780]: I0929 18:58:06.153303 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:58:06 crc kubenswrapper[4780]: I0929 18:58:06.153767 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:58:06 crc kubenswrapper[4780]: I0929 18:58:06.197858 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:58:06 crc kubenswrapper[4780]: I0929 18:58:06.830836 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hrkrx" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.865529 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7"] Sep 29 18:58:07 crc kubenswrapper[4780]: E0929 18:58:07.865871 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2179424-c9c3-4891-b07e-712b8c30012a" containerName="registry-server" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.865886 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2179424-c9c3-4891-b07e-712b8c30012a" containerName="registry-server" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.866102 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2179424-c9c3-4891-b07e-712b8c30012a" containerName="registry-server" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.867187 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.869761 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hjbh7" Sep 29 18:58:07 crc kubenswrapper[4780]: I0929 18:58:07.876263 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7"] Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.037160 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.037920 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmxnz\" (UniqueName: \"kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.037976 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.139830 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.139904 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmxnz\" (UniqueName: \"kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.139974 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.140830 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.140836 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.164891 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmxnz\" (UniqueName: \"kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz\") pod \"5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.192572 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.611722 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7"] Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.801180 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerStarted","Data":"db9a1ab5693b0e0176ab5872a25d3f123af5be26fb6fed5325ff76258aee4c79"} Sep 29 18:58:08 crc kubenswrapper[4780]: I0929 18:58:08.801237 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerStarted","Data":"b7cd6d6cd86bc4d662017ec6bc350754abd3cf8760edb9fbe0db6a5f17a7ffd8"} Sep 29 18:58:09 crc kubenswrapper[4780]: I0929 18:58:09.810684 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerID="db9a1ab5693b0e0176ab5872a25d3f123af5be26fb6fed5325ff76258aee4c79" exitCode=0 Sep 29 18:58:09 crc kubenswrapper[4780]: I0929 18:58:09.810978 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerDied","Data":"db9a1ab5693b0e0176ab5872a25d3f123af5be26fb6fed5325ff76258aee4c79"} Sep 29 18:58:11 crc kubenswrapper[4780]: I0929 18:58:11.825877 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerID="59b2c390a97df323113af87b45eb80ad59db0365107c6cde2bd6e5868765c10f" exitCode=0 Sep 29 18:58:11 crc kubenswrapper[4780]: I0929 18:58:11.825958 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" 
event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerDied","Data":"59b2c390a97df323113af87b45eb80ad59db0365107c6cde2bd6e5868765c10f"} Sep 29 18:58:12 crc kubenswrapper[4780]: I0929 18:58:12.836567 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerID="8634ae2eda75d18d6af9f30246559898b7543366e769b5b3c482c502049b7bcf" exitCode=0 Sep 29 18:58:12 crc kubenswrapper[4780]: I0929 18:58:12.836781 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerDied","Data":"8634ae2eda75d18d6af9f30246559898b7543366e769b5b3c482c502049b7bcf"} Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.107692 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.232652 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util\") pod \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.232844 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmxnz\" (UniqueName: \"kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz\") pod \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.232895 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle\") pod \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\" (UID: \"3ced0c80-3dc4-4f78-95d1-eaaa021aad95\") " Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.233796 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle" (OuterVolumeSpecName: "bundle") pod "3ced0c80-3dc4-4f78-95d1-eaaa021aad95" (UID: "3ced0c80-3dc4-4f78-95d1-eaaa021aad95"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.243297 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz" (OuterVolumeSpecName: "kube-api-access-bmxnz") pod "3ced0c80-3dc4-4f78-95d1-eaaa021aad95" (UID: "3ced0c80-3dc4-4f78-95d1-eaaa021aad95"). InnerVolumeSpecName "kube-api-access-bmxnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.252637 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util" (OuterVolumeSpecName: "util") pod "3ced0c80-3dc4-4f78-95d1-eaaa021aad95" (UID: "3ced0c80-3dc4-4f78-95d1-eaaa021aad95"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.334683 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmxnz\" (UniqueName: \"kubernetes.io/projected/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-kube-api-access-bmxnz\") on node \"crc\" DevicePath \"\"" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.334730 4780 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.334745 4780 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ced0c80-3dc4-4f78-95d1-eaaa021aad95-util\") on node \"crc\" DevicePath \"\"" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.866308 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" event={"ID":"3ced0c80-3dc4-4f78-95d1-eaaa021aad95","Type":"ContainerDied","Data":"b7cd6d6cd86bc4d662017ec6bc350754abd3cf8760edb9fbe0db6a5f17a7ffd8"} Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.866360 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7cd6d6cd86bc4d662017ec6bc350754abd3cf8760edb9fbe0db6a5f17a7ffd8" Sep 29 18:58:14 crc kubenswrapper[4780]: I0929 18:58:14.866403 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.850887 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv"] Sep 29 18:58:20 crc kubenswrapper[4780]: E0929 18:58:20.852084 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="util" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.852101 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="util" Sep 29 18:58:20 crc kubenswrapper[4780]: E0929 18:58:20.852119 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="pull" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.852125 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="pull" Sep 29 18:58:20 crc kubenswrapper[4780]: E0929 18:58:20.852146 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="extract" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.852153 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="extract" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.852298 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ced0c80-3dc4-4f78-95d1-eaaa021aad95" containerName="extract" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.853111 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.855583 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fjtpc" Sep 29 18:58:20 crc kubenswrapper[4780]: I0929 18:58:20.880791 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv"] Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.028632 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwxx\" (UniqueName: \"kubernetes.io/projected/727c9941-0992-4a9c-8c56-ffa31bb24cf4-kube-api-access-hqwxx\") pod \"openstack-operator-controller-operator-56dc567787-f9qwv\" (UID: \"727c9941-0992-4a9c-8c56-ffa31bb24cf4\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.129987 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwxx\" (UniqueName: \"kubernetes.io/projected/727c9941-0992-4a9c-8c56-ffa31bb24cf4-kube-api-access-hqwxx\") pod \"openstack-operator-controller-operator-56dc567787-f9qwv\" (UID: \"727c9941-0992-4a9c-8c56-ffa31bb24cf4\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.152832 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwxx\" (UniqueName: \"kubernetes.io/projected/727c9941-0992-4a9c-8c56-ffa31bb24cf4-kube-api-access-hqwxx\") pod \"openstack-operator-controller-operator-56dc567787-f9qwv\" (UID: \"727c9941-0992-4a9c-8c56-ffa31bb24cf4\") " pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.172663 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.635513 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv"] Sep 29 18:58:21 crc kubenswrapper[4780]: I0929 18:58:21.913931 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" event={"ID":"727c9941-0992-4a9c-8c56-ffa31bb24cf4","Type":"ContainerStarted","Data":"de37bf815d9f6fee0b43608777f888b052719a5b668b1b10b0f3eb4b5139462a"} Sep 29 18:58:26 crc kubenswrapper[4780]: I0929 18:58:26.973821 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" event={"ID":"727c9941-0992-4a9c-8c56-ffa31bb24cf4","Type":"ContainerStarted","Data":"2c8a6f72c7b6c7b9582c2941bf5bb4a59e0819944a49c8a05524d972bf3aad99"} Sep 29 18:58:28 crc kubenswrapper[4780]: I0929 18:58:28.991873 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" event={"ID":"727c9941-0992-4a9c-8c56-ffa31bb24cf4","Type":"ContainerStarted","Data":"7d693e351c850bacbdb7e56ff8f0837febaeee0afdc60816f9d39d58045b5333"} Sep 29 18:58:28 crc kubenswrapper[4780]: I0929 18:58:28.992395 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:29 crc kubenswrapper[4780]: I0929 18:58:29.025609 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" podStartSLOduration=2.494387251 podStartE2EDuration="9.025582667s" podCreationTimestamp="2025-09-29 18:58:20 +0000 UTC" firstStartedPulling="2025-09-29 18:58:21.645173862 +0000 UTC m=+901.593471906" lastFinishedPulling="2025-09-29 18:58:28.176369278 +0000 UTC m=+908.124667322" observedRunningTime="2025-09-29 18:58:29.018998749 +0000 UTC m=+908.967296803" watchObservedRunningTime="2025-09-29 18:58:29.025582667 +0000 UTC m=+908.973880711" Sep 29 18:58:31 crc kubenswrapper[4780]: I0929 18:58:31.177655 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56dc567787-f9qwv" Sep 29 18:58:33 crc kubenswrapper[4780]: I0929 18:58:33.223934 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:58:33 crc kubenswrapper[4780]: I0929 18:58:33.224845 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.709639 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.711591 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.719472 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bk8k5" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.734911 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.736370 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.742842 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5hfzs" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.749711 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.751124 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.753526 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gkml5" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.762548 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.769218 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.773155 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2x5w\" (UniqueName: \"kubernetes.io/projected/8add36ee-ae48-47aa-a1b8-39e26a2b61c4-kube-api-access-b2x5w\") pod \"cinder-operator-controller-manager-859cd486d-n9smm\" (UID: \"8add36ee-ae48-47aa-a1b8-39e26a2b61c4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.773227 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56sf\" (UniqueName: \"kubernetes.io/projected/f488a5b4-5b60-4e98-9095-5c6b3e7d580b-kube-api-access-m56sf\") pod \"barbican-operator-controller-manager-f7f98cb69-qm8gn\" (UID: \"f488a5b4-5b60-4e98-9095-5c6b3e7d580b\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.856021 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.870195 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.872474 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.874406 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2x5w\" (UniqueName: \"kubernetes.io/projected/8add36ee-ae48-47aa-a1b8-39e26a2b61c4-kube-api-access-b2x5w\") pod \"cinder-operator-controller-manager-859cd486d-n9smm\" (UID: \"8add36ee-ae48-47aa-a1b8-39e26a2b61c4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.874491 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhn2p\" (UniqueName: \"kubernetes.io/projected/afe8c052-ff7e-4892-81fa-8045f69346eb-kube-api-access-hhn2p\") pod \"designate-operator-controller-manager-77fb7bcf5b-xhck5\" (UID: \"afe8c052-ff7e-4892-81fa-8045f69346eb\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.874527 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56sf\" (UniqueName: \"kubernetes.io/projected/f488a5b4-5b60-4e98-9095-5c6b3e7d580b-kube-api-access-m56sf\") pod \"barbican-operator-controller-manager-f7f98cb69-qm8gn\" (UID: \"f488a5b4-5b60-4e98-9095-5c6b3e7d580b\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.877439 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-g7gxq" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.884077 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.885718 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.889024 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wfj5s" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.890383 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.891851 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.893478 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7j6bw" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.900235 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.905835 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.910333 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.927283 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.929332 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.936856 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.942600 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vs6tc" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.959705 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.962885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2x5w\" (UniqueName: \"kubernetes.io/projected/8add36ee-ae48-47aa-a1b8-39e26a2b61c4-kube-api-access-b2x5w\") pod \"cinder-operator-controller-manager-859cd486d-n9smm\" (UID: \"8add36ee-ae48-47aa-a1b8-39e26a2b61c4\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.971415 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.974729 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56sf\" (UniqueName: \"kubernetes.io/projected/f488a5b4-5b60-4e98-9095-5c6b3e7d580b-kube-api-access-m56sf\") pod \"barbican-operator-controller-manager-f7f98cb69-qm8gn\" (UID: \"f488a5b4-5b60-4e98-9095-5c6b3e7d580b\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975449 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rgv\" (UniqueName: \"kubernetes.io/projected/de762ea1-08cb-48cd-8e29-2d7523a63ef8-kube-api-access-t4rgv\") pod \"heat-operator-controller-manager-5b4fc86755-f2xqf\" (UID: \"de762ea1-08cb-48cd-8e29-2d7523a63ef8\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975483 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhn2p\" (UniqueName: \"kubernetes.io/projected/afe8c052-ff7e-4892-81fa-8045f69346eb-kube-api-access-hhn2p\") pod \"designate-operator-controller-manager-77fb7bcf5b-xhck5\" (UID: \"afe8c052-ff7e-4892-81fa-8045f69346eb\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975511 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvf5\" (UniqueName: \"kubernetes.io/projected/85948289-f8ff-4ccb-8322-17c68d0ca529-kube-api-access-mhvf5\") pod \"glance-operator-controller-manager-8bc4775b5-r4g5l\" (UID: \"85948289-f8ff-4ccb-8322-17c68d0ca529\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg2rc\" (UniqueName: \"kubernetes.io/projected/53223fa3-3901-4f53-9c6b-18e07485a7ad-kube-api-access-pg2rc\") pod \"horizon-operator-controller-manager-679b4759bb-7ktf6\" (UID: \"53223fa3-3901-4f53-9c6b-18e07485a7ad\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975607 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sssm\" (UniqueName: \"kubernetes.io/projected/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-kube-api-access-8sssm\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975607 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.975649 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.983585 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.987574 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rtchr" Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.992517 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn"] Sep 29 18:58:46 crc kubenswrapper[4780]: I0929 18:58:46.993654 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.012449 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xtgpv" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.013414 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhn2p\" (UniqueName: \"kubernetes.io/projected/afe8c052-ff7e-4892-81fa-8045f69346eb-kube-api-access-hhn2p\") pod \"designate-operator-controller-manager-77fb7bcf5b-xhck5\" (UID: \"afe8c052-ff7e-4892-81fa-8045f69346eb\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.034412 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.057624 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.075215 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.089786 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.089856 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rgv\" (UniqueName: \"kubernetes.io/projected/de762ea1-08cb-48cd-8e29-2d7523a63ef8-kube-api-access-t4rgv\") pod \"heat-operator-controller-manager-5b4fc86755-f2xqf\" (UID: \"de762ea1-08cb-48cd-8e29-2d7523a63ef8\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.089893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvf5\" (UniqueName: \"kubernetes.io/projected/85948289-f8ff-4ccb-8322-17c68d0ca529-kube-api-access-mhvf5\") pod \"glance-operator-controller-manager-8bc4775b5-r4g5l\" (UID: \"85948289-f8ff-4ccb-8322-17c68d0ca529\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.089936 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7fh\" (UniqueName: \"kubernetes.io/projected/40a3e409-3dbc-4936-819f-c64fe007d584-kube-api-access-hn7fh\") pod \"keystone-operator-controller-manager-59d7dc95cf-lssxn\" (UID: \"40a3e409-3dbc-4936-819f-c64fe007d584\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.090010 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg2rc\" (UniqueName: \"kubernetes.io/projected/53223fa3-3901-4f53-9c6b-18e07485a7ad-kube-api-access-pg2rc\") pod \"horizon-operator-controller-manager-679b4759bb-7ktf6\" (UID: 
\"53223fa3-3901-4f53-9c6b-18e07485a7ad\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.090132 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sssm\" (UniqueName: \"kubernetes.io/projected/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-kube-api-access-8sssm\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.090171 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phpw8\" (UniqueName: \"kubernetes.io/projected/0dffef5d-ec0f-4e39-a948-c670be2a8521-kube-api-access-phpw8\") pod \"ironic-operator-controller-manager-6f589bc7f7-zfhtk\" (UID: \"0dffef5d-ec0f-4e39-a948-c670be2a8521\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.090532 4780 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.090600 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert podName:cd74a7d5-36fa-4c53-b9a6-9f9a733791d5 nodeName:}" failed. No retries permitted until 2025-09-29 18:58:47.59057783 +0000 UTC m=+927.538875864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert") pod "infra-operator-controller-manager-7d9c7d9477-jzhjc" (UID: "cd74a7d5-36fa-4c53-b9a6-9f9a733791d5") : secret "infra-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.090986 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.103168 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.105105 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.125841 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-np6vs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.138821 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.139014 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg2rc\" (UniqueName: \"kubernetes.io/projected/53223fa3-3901-4f53-9c6b-18e07485a7ad-kube-api-access-pg2rc\") pod \"horizon-operator-controller-manager-679b4759bb-7ktf6\" (UID: \"53223fa3-3901-4f53-9c6b-18e07485a7ad\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.139821 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rgv\" (UniqueName: \"kubernetes.io/projected/de762ea1-08cb-48cd-8e29-2d7523a63ef8-kube-api-access-t4rgv\") pod \"heat-operator-controller-manager-5b4fc86755-f2xqf\" (UID: \"de762ea1-08cb-48cd-8e29-2d7523a63ef8\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.139912 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sssm\" (UniqueName: \"kubernetes.io/projected/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-kube-api-access-8sssm\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.145950 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.147909 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.157349 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pxzc9" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.161453 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvf5\" (UniqueName: \"kubernetes.io/projected/85948289-f8ff-4ccb-8322-17c68d0ca529-kube-api-access-mhvf5\") pod \"glance-operator-controller-manager-8bc4775b5-r4g5l\" (UID: \"85948289-f8ff-4ccb-8322-17c68d0ca529\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.192310 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phpw8\" (UniqueName: \"kubernetes.io/projected/0dffef5d-ec0f-4e39-a948-c670be2a8521-kube-api-access-phpw8\") pod \"ironic-operator-controller-manager-6f589bc7f7-zfhtk\" (UID: \"0dffef5d-ec0f-4e39-a948-c670be2a8521\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.192354 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpxs\" (UniqueName: \"kubernetes.io/projected/1b274b66-59c6-49e6-8469-dfaa9d5a85cc-kube-api-access-zcpxs\") pod \"manila-operator-controller-manager-b7cf8cb5f-slhwp\" (UID: \"1b274b66-59c6-49e6-8469-dfaa9d5a85cc\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.192422 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcrm\" (UniqueName: \"kubernetes.io/projected/28049dad-f386-4b21-b525-63fd463b8c37-kube-api-access-nqcrm\") pod \"mariadb-operator-controller-manager-67bf5bb885-46wcs\" (UID: \"28049dad-f386-4b21-b525-63fd463b8c37\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.192449 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7fh\" (UniqueName: \"kubernetes.io/projected/40a3e409-3dbc-4936-819f-c64fe007d584-kube-api-access-hn7fh\") pod \"keystone-operator-controller-manager-59d7dc95cf-lssxn\" (UID: \"40a3e409-3dbc-4936-819f-c64fe007d584\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.192922 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.194056 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.201279 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tg8ln" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.201461 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.215221 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.232808 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.233227 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7fh\" (UniqueName: \"kubernetes.io/projected/40a3e409-3dbc-4936-819f-c64fe007d584-kube-api-access-hn7fh\") pod \"keystone-operator-controller-manager-59d7dc95cf-lssxn\" (UID: \"40a3e409-3dbc-4936-819f-c64fe007d584\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.258152 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.265025 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.266318 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.280302 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.288149 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wsb5k" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.298960 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crx2l\" (UniqueName: \"kubernetes.io/projected/eabb644f-cfed-402e-8e6c-b98dc6ec30ef-kube-api-access-crx2l\") pod \"nova-operator-controller-manager-79f9fc9fd8-7mc4n\" (UID: \"eabb644f-cfed-402e-8e6c-b98dc6ec30ef\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.299117 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpxs\" (UniqueName: \"kubernetes.io/projected/1b274b66-59c6-49e6-8469-dfaa9d5a85cc-kube-api-access-zcpxs\") pod \"manila-operator-controller-manager-b7cf8cb5f-slhwp\" (UID: \"1b274b66-59c6-49e6-8469-dfaa9d5a85cc\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.299156 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfks2\" (UniqueName: \"kubernetes.io/projected/6865eded-097c-49c7-a54d-cda27a2adc65-kube-api-access-nfks2\") pod \"neutron-operator-controller-manager-6b96467f46-lfnp5\" (UID: \"6865eded-097c-49c7-a54d-cda27a2adc65\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.299223 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcrm\" (UniqueName: \"kubernetes.io/projected/28049dad-f386-4b21-b525-63fd463b8c37-kube-api-access-nqcrm\") pod \"mariadb-operator-controller-manager-67bf5bb885-46wcs\" (UID: \"28049dad-f386-4b21-b525-63fd463b8c37\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.300770 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phpw8\" (UniqueName: \"kubernetes.io/projected/0dffef5d-ec0f-4e39-a948-c670be2a8521-kube-api-access-phpw8\") pod \"ironic-operator-controller-manager-6f589bc7f7-zfhtk\" (UID: \"0dffef5d-ec0f-4e39-a948-c670be2a8521\") " pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.329372 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.330622 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.337777 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6j7vm" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.344112 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.371425 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.371781 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpxs\" (UniqueName: \"kubernetes.io/projected/1b274b66-59c6-49e6-8469-dfaa9d5a85cc-kube-api-access-zcpxs\") pod \"manila-operator-controller-manager-b7cf8cb5f-slhwp\" (UID: \"1b274b66-59c6-49e6-8469-dfaa9d5a85cc\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.380399 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcrm\" (UniqueName: \"kubernetes.io/projected/28049dad-f386-4b21-b525-63fd463b8c37-kube-api-access-nqcrm\") pod \"mariadb-operator-controller-manager-67bf5bb885-46wcs\" (UID: \"28049dad-f386-4b21-b525-63fd463b8c37\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.384003 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.397013 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.400971 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w84q\" (UniqueName: \"kubernetes.io/projected/23713df8-910e-453e-a639-cdfc43473071-kube-api-access-7w84q\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-dfxwb\" (UID: \"23713df8-910e-453e-a639-cdfc43473071\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.401077 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfks2\" (UniqueName: \"kubernetes.io/projected/6865eded-097c-49c7-a54d-cda27a2adc65-kube-api-access-nfks2\") pod \"neutron-operator-controller-manager-6b96467f46-lfnp5\" (UID: \"6865eded-097c-49c7-a54d-cda27a2adc65\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.401190 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crx2l\" (UniqueName: \"kubernetes.io/projected/eabb644f-cfed-402e-8e6c-b98dc6ec30ef-kube-api-access-crx2l\") pod \"nova-operator-controller-manager-79f9fc9fd8-7mc4n\" (UID: \"eabb644f-cfed-402e-8e6c-b98dc6ec30ef\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.407510 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nq9vz" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.409688 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.427518 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-87km7"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.429545 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.443583 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nzrqs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.471055 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.471354 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crx2l\" (UniqueName: \"kubernetes.io/projected/eabb644f-cfed-402e-8e6c-b98dc6ec30ef-kube-api-access-crx2l\") pod \"nova-operator-controller-manager-79f9fc9fd8-7mc4n\" (UID: \"eabb644f-cfed-402e-8e6c-b98dc6ec30ef\") " pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.471785 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfks2\" (UniqueName: \"kubernetes.io/projected/6865eded-097c-49c7-a54d-cda27a2adc65-kube-api-access-nfks2\") pod \"neutron-operator-controller-manager-6b96467f46-lfnp5\" (UID: \"6865eded-097c-49c7-a54d-cda27a2adc65\") " pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.488219 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.484858 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.507162 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-87km7"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.512659 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49z7r\" (UniqueName: \"kubernetes.io/projected/cec435bb-5818-41aa-8177-dfdddc267c00-kube-api-access-49z7r\") pod \"ovn-operator-controller-manager-84c745747f-87km7\" (UID: \"cec435bb-5818-41aa-8177-dfdddc267c00\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.512888 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w84q\" (UniqueName: \"kubernetes.io/projected/23713df8-910e-453e-a639-cdfc43473071-kube-api-access-7w84q\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-dfxwb\" (UID: \"23713df8-910e-453e-a639-cdfc43473071\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.515880 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.544882 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.546440 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.551286 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.555800 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-447rn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.580076 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w84q\" (UniqueName: \"kubernetes.io/projected/23713df8-910e-453e-a639-cdfc43473071-kube-api-access-7w84q\") pod \"octavia-operator-controller-manager-6fb7d6b8bf-dfxwb\" (UID: \"23713df8-910e-453e-a639-cdfc43473071\") " pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.580735 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.628284 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.636538 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.648304 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.648392 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49z7r\" (UniqueName: \"kubernetes.io/projected/cec435bb-5818-41aa-8177-dfdddc267c00-kube-api-access-49z7r\") pod \"ovn-operator-controller-manager-84c745747f-87km7\" (UID: \"cec435bb-5818-41aa-8177-dfdddc267c00\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.648436 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5b6k\" (UniqueName: \"kubernetes.io/projected/dcc90bb2-08d8-448b-85bb-955bfc3a7371-kube-api-access-n5b6k\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.648491 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: 
\"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.648489 4780 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.648574 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert podName:cd74a7d5-36fa-4c53-b9a6-9f9a733791d5 nodeName:}" failed. No retries permitted until 2025-09-29 18:58:48.648549082 +0000 UTC m=+928.596847126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert") pod "infra-operator-controller-manager-7d9c7d9477-jzhjc" (UID: "cd74a7d5-36fa-4c53-b9a6-9f9a733791d5") : secret "infra-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.662118 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.663454 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.669684 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2xt9n" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.700996 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49z7r\" (UniqueName: \"kubernetes.io/projected/cec435bb-5818-41aa-8177-dfdddc267c00-kube-api-access-49z7r\") pod \"ovn-operator-controller-manager-84c745747f-87km7\" (UID: \"cec435bb-5818-41aa-8177-dfdddc267c00\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.702115 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.703462 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.707423 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jgclm" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.729384 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.740159 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.744713 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.752295 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jhngx" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754115 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754165 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h8x9\" (UniqueName: \"kubernetes.io/projected/ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a-kube-api-access-9h8x9\") pod \"swift-operator-controller-manager-657c6b68c7-625qh\" (UID: \"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754213 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npz7\" (UniqueName: \"kubernetes.io/projected/541588f6-d71c-42ca-b4eb-515f5409f2d1-kube-api-access-9npz7\") pod \"placement-operator-controller-manager-598c4c8547-d8s8d\" (UID: \"541588f6-d71c-42ca-b4eb-515f5409f2d1\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754295 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrrbv\" (UniqueName: \"kubernetes.io/projected/02f6d355-f384-4b36-b518-55ad38e66215-kube-api-access-zrrbv\") pod \"test-operator-controller-manager-6bb97fcf96-szxrn\" (UID: \"02f6d355-f384-4b36-b518-55ad38e66215\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754314 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5b6k\" (UniqueName: \"kubernetes.io/projected/dcc90bb2-08d8-448b-85bb-955bfc3a7371-kube-api-access-n5b6k\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.754340 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf5m4\" (UniqueName: \"kubernetes.io/projected/e6865432-79dd-4823-a42d-bb08417a0f90-kube-api-access-wf5m4\") pod \"telemetry-operator-controller-manager-cb66d6b59-nznlf\" (UID: \"e6865432-79dd-4823-a42d-bb08417a0f90\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.754298 4780 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.754455 4780 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert podName:dcc90bb2-08d8-448b-85bb-955bfc3a7371 nodeName:}" failed. No retries permitted until 2025-09-29 18:58:48.254413032 +0000 UTC m=+928.202711076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert") pod "openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" (UID: "dcc90bb2-08d8-448b-85bb-955bfc3a7371") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.774756 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.775922 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5b6k\" (UniqueName: \"kubernetes.io/projected/dcc90bb2-08d8-448b-85bb-955bfc3a7371-kube-api-access-n5b6k\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.781749 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.793090 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.794391 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.798983 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jtlrs" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.800979 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.806150 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.826623 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.828502 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.832081 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.832580 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-t6qrt" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.837292 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.847830 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.849447 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.851878 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.852584 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9f8h4" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855623 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855699 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrrbv\" (UniqueName: \"kubernetes.io/projected/02f6d355-f384-4b36-b518-55ad38e66215-kube-api-access-zrrbv\") pod \"test-operator-controller-manager-6bb97fcf96-szxrn\" (UID: \"02f6d355-f384-4b36-b518-55ad38e66215\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855731 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf5m4\" (UniqueName: \"kubernetes.io/projected/e6865432-79dd-4823-a42d-bb08417a0f90-kube-api-access-wf5m4\") pod \"telemetry-operator-controller-manager-cb66d6b59-nznlf\" (UID: \"e6865432-79dd-4823-a42d-bb08417a0f90\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855764 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bprlx\" (UniqueName: \"kubernetes.io/projected/18dfa3ae-5e34-436b-87b9-f215e898567c-kube-api-access-bprlx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855797 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h8x9\" 
(UniqueName: \"kubernetes.io/projected/ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a-kube-api-access-9h8x9\") pod \"swift-operator-controller-manager-657c6b68c7-625qh\" (UID: \"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855830 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n26h\" (UniqueName: \"kubernetes.io/projected/e6b57f8b-1be2-48b1-be60-30a3583f6052-kube-api-access-7n26h\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j92fn\" (UID: \"e6b57f8b-1be2-48b1-be60-30a3583f6052\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855859 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npz7\" (UniqueName: \"kubernetes.io/projected/541588f6-d71c-42ca-b4eb-515f5409f2d1-kube-api-access-9npz7\") pod \"placement-operator-controller-manager-598c4c8547-d8s8d\" (UID: \"541588f6-d71c-42ca-b4eb-515f5409f2d1\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.855880 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrzj\" (UniqueName: \"kubernetes.io/projected/8fceb492-ba01-4e2f-b59b-6557da4e851a-kube-api-access-4nrzj\") pod \"watcher-operator-controller-manager-75756dd4d9-ksn7s\" (UID: \"8fceb492-ba01-4e2f-b59b-6557da4e851a\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.857335 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.879648 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf5m4\" (UniqueName: \"kubernetes.io/projected/e6865432-79dd-4823-a42d-bb08417a0f90-kube-api-access-wf5m4\") pod \"telemetry-operator-controller-manager-cb66d6b59-nznlf\" (UID: \"e6865432-79dd-4823-a42d-bb08417a0f90\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.880858 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrrbv\" (UniqueName: \"kubernetes.io/projected/02f6d355-f384-4b36-b518-55ad38e66215-kube-api-access-zrrbv\") pod \"test-operator-controller-manager-6bb97fcf96-szxrn\" (UID: \"02f6d355-f384-4b36-b518-55ad38e66215\") " pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.885146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npz7\" (UniqueName: \"kubernetes.io/projected/541588f6-d71c-42ca-b4eb-515f5409f2d1-kube-api-access-9npz7\") pod \"placement-operator-controller-manager-598c4c8547-d8s8d\" (UID: \"541588f6-d71c-42ca-b4eb-515f5409f2d1\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.891965 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h8x9\" (UniqueName: \"kubernetes.io/projected/ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a-kube-api-access-9h8x9\") pod 
\"swift-operator-controller-manager-657c6b68c7-625qh\" (UID: \"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a\") " pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.924635 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.926658 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn"] Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.930504 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.956918 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bprlx\" (UniqueName: \"kubernetes.io/projected/18dfa3ae-5e34-436b-87b9-f215e898567c-kube-api-access-bprlx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.956998 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n26h\" (UniqueName: \"kubernetes.io/projected/e6b57f8b-1be2-48b1-be60-30a3583f6052-kube-api-access-7n26h\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j92fn\" (UID: \"e6b57f8b-1be2-48b1-be60-30a3583f6052\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.957022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrzj\" (UniqueName: \"kubernetes.io/projected/8fceb492-ba01-4e2f-b59b-6557da4e851a-kube-api-access-4nrzj\") pod \"watcher-operator-controller-manager-75756dd4d9-ksn7s\" (UID: \"8fceb492-ba01-4e2f-b59b-6557da4e851a\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" Sep 29 18:58:47 crc kubenswrapper[4780]: I0929 18:58:47.966790 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.967877 4780 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 29 18:58:47 crc kubenswrapper[4780]: E0929 18:58:47.967941 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert podName:18dfa3ae-5e34-436b-87b9-f215e898567c nodeName:}" failed. No retries permitted until 2025-09-29 18:58:48.467923074 +0000 UTC m=+928.416221118 (durationBeforeRetry 500ms). 
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.012735 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.016548 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n26h\" (UniqueName: \"kubernetes.io/projected/e6b57f8b-1be2-48b1-be60-30a3583f6052-kube-api-access-7n26h\") pod \"rabbitmq-cluster-operator-manager-79d8469568-j92fn\" (UID: \"e6b57f8b-1be2-48b1-be60-30a3583f6052\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.020476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bprlx\" (UniqueName: \"kubernetes.io/projected/18dfa3ae-5e34-436b-87b9-f215e898567c-kube-api-access-bprlx\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.023318 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrzj\" (UniqueName: \"kubernetes.io/projected/8fceb492-ba01-4e2f-b59b-6557da4e851a-kube-api-access-4nrzj\") pod \"watcher-operator-controller-manager-75756dd4d9-ksn7s\" (UID: \"8fceb492-ba01-4e2f-b59b-6557da4e851a\") " pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.036363 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.083314 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.101646 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.265631 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.265988 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.277597 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.290160 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcc90bb2-08d8-448b-85bb-955bfc3a7371-cert\") pod \"openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh\" (UID: \"dcc90bb2-08d8-448b-85bb-955bfc3a7371\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"
Sep 29 18:58:48 crc kubenswrapper[4780]: W0929 18:58:48.406977 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe8c052_ff7e_4892_81fa_8045f69346eb.slice/crio-01bd4c46bd5f924ece843168699283cd8b756ec46587704ae90f2e62fe79d696 WatchSource:0}: Error finding container 01bd4c46bd5f924ece843168699283cd8b756ec46587704ae90f2e62fe79d696: Status 404 returned error can't find the container with id 01bd4c46bd5f924ece843168699283cd8b756ec46587704ae90f2e62fe79d696
Sep 29 18:58:48 crc kubenswrapper[4780]: W0929 18:58:48.409450 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8add36ee_ae48_47aa_a1b8_39e26a2b61c4.slice/crio-b3738cebe1a38529e0fa324e8a8c7bc25fefd5e3bb120740ddb2b7e7b64302cd WatchSource:0}: Error finding container b3738cebe1a38529e0fa324e8a8c7bc25fefd5e3bb120740ddb2b7e7b64302cd: Status 404 returned error can't find the container with id b3738cebe1a38529e0fa324e8a8c7bc25fefd5e3bb120740ddb2b7e7b64302cd
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.418488 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.426985 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.497990 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn"
Sep 29 18:58:48 crc kubenswrapper[4780]: E0929 18:58:48.499375 4780 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Sep 29 18:58:48 crc kubenswrapper[4780]: E0929 18:58:48.499434 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert podName:18dfa3ae-5e34-436b-87b9-f215e898567c nodeName:}" failed. No retries permitted until 2025-09-29 18:58:49.499412467 +0000 UTC m=+929.447710511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert") pod "openstack-operator-controller-manager-7b7bb8bd67-s9lgn" (UID: "18dfa3ae-5e34-436b-87b9-f215e898567c") : secret "webhook-server-cert" not found
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.712497 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.716308 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.719305 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd74a7d5-36fa-4c53-b9a6-9f9a733791d5-cert\") pod \"infra-operator-controller-manager-7d9c7d9477-jzhjc\" (UID: \"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5\") " pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"
Sep 29 18:58:48 crc kubenswrapper[4780]: W0929 18:58:48.738914 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53223fa3_3901_4f53_9c6b_18e07485a7ad.slice/crio-76ba69e7cbf97fdff55ea49ee3aba34da29a7a12e6697ef2248e239b3e037131 WatchSource:0}: Error finding container 76ba69e7cbf97fdff55ea49ee3aba34da29a7a12e6697ef2248e239b3e037131: Status 404 returned error can't find the container with id 76ba69e7cbf97fdff55ea49ee3aba34da29a7a12e6697ef2248e239b3e037131
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.879962 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp"]
Sep 29 18:58:48 crc kubenswrapper[4780]: W0929 18:58:48.922310 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23713df8_910e_453e_a639_cdfc43473071.slice/crio-b7dc941997a73f51cf02892f019666633bba15aedcb72acc6eb361843f6c796a WatchSource:0}: Error finding container b7dc941997a73f51cf02892f019666633bba15aedcb72acc6eb361843f6c796a: Status 404 returned error can't find the container with id b7dc941997a73f51cf02892f019666633bba15aedcb72acc6eb361843f6c796a
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.923496 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.933706 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb"]
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.935931 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"
Sep 29 18:58:48 crc kubenswrapper[4780]: I0929 18:58:48.961262 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.118477 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.124615 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n"]
Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.148523 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6865eded_097c_49c7_a54d_cda27a2adc65.slice/crio-94d719ed7d28cd4162d7aa974e0f7ce142d8e85300e9c13af881a9a25b854e63 WatchSource:0}: Error finding container 94d719ed7d28cd4162d7aa974e0f7ce142d8e85300e9c13af881a9a25b854e63: Status 404 returned error can't find the container with id 94d719ed7d28cd4162d7aa974e0f7ce142d8e85300e9c13af881a9a25b854e63
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.193153 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" event={"ID":"0dffef5d-ec0f-4e39-a948-c670be2a8521","Type":"ContainerStarted","Data":"ab4f953dd59b7f893c2b35ac94579b5520434a11c3c50483b132aa36e9ad4b40"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.196276 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" event={"ID":"53223fa3-3901-4f53-9c6b-18e07485a7ad","Type":"ContainerStarted","Data":"76ba69e7cbf97fdff55ea49ee3aba34da29a7a12e6697ef2248e239b3e037131"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.202241 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" event={"ID":"eabb644f-cfed-402e-8e6c-b98dc6ec30ef","Type":"ContainerStarted","Data":"f99af240cbe36a868c722a2192c2e8d3097deb065888f79922e77ce52e02f8b5"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.205363 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" event={"ID":"f488a5b4-5b60-4e98-9095-5c6b3e7d580b","Type":"ContainerStarted","Data":"72060cbeb1c3c31d2292b7aa70bee89bc83e38e11ff4d030de9eab3fc08e7664"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.211516 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" event={"ID":"23713df8-910e-453e-a639-cdfc43473071","Type":"ContainerStarted","Data":"b7dc941997a73f51cf02892f019666633bba15aedcb72acc6eb361843f6c796a"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.216916 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" event={"ID":"85948289-f8ff-4ccb-8322-17c68d0ca529","Type":"ContainerStarted","Data":"a3be3c8ecf910a17ba626e9188b2bd8f3265a5a0130cb4fcdf495d1cc65978e9"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.220118 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" event={"ID":"de762ea1-08cb-48cd-8e29-2d7523a63ef8","Type":"ContainerStarted","Data":"03fc23951c1f53c158c074a7c54a5ecd4194c510062f5a330a8cbc86bcc15ee8"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.223631 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" event={"ID":"afe8c052-ff7e-4892-81fa-8045f69346eb","Type":"ContainerStarted","Data":"01bd4c46bd5f924ece843168699283cd8b756ec46587704ae90f2e62fe79d696"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.225701 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" event={"ID":"1b274b66-59c6-49e6-8469-dfaa9d5a85cc","Type":"ContainerStarted","Data":"843d091c82be526842a31dca3cc2e519dd5a176af3ec1d68bcb47de24933c8aa"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.227782 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" event={"ID":"8add36ee-ae48-47aa-a1b8-39e26a2b61c4","Type":"ContainerStarted","Data":"b3738cebe1a38529e0fa324e8a8c7bc25fefd5e3bb120740ddb2b7e7b64302cd"}
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.420520 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.430131 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.436303 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.443283 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.450171 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.492699 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.498536 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.506175 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-87km7"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.513612 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh"]
Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.521459 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s"]
Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.525553 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec435bb_5818_41aa_8177_dfdddc267c00.slice/crio-80ee4fad13d8b7ab6e73d3c652099df910d442ef9b9e20c29e278cb0acdd9fa8 WatchSource:0}: Error finding container 
80ee4fad13d8b7ab6e73d3c652099df910d442ef9b9e20c29e278cb0acdd9fa8: Status 404 returned error can't find the container with id 80ee4fad13d8b7ab6e73d3c652099df910d442ef9b9e20c29e278cb0acdd9fa8 Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.526059 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc"] Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.527679 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6865432_79dd_4823_a42d_bb08417a0f90.slice/crio-e291b4b0e65a8ed9c25f5aa333b7247aae2b83831687718cf8d20c98e9e43f78 WatchSource:0}: Error finding container e291b4b0e65a8ed9c25f5aa333b7247aae2b83831687718cf8d20c98e9e43f78: Status 404 returned error can't find the container with id e291b4b0e65a8ed9c25f5aa333b7247aae2b83831687718cf8d20c98e9e43f78 Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.527805 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.528890 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49z7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-84c745747f-87km7_openstack-operators(cec435bb-5818-41aa-8177-dfdddc267c00): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.532379 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wf5m4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-cb66d6b59-nznlf_openstack-operators(e6865432-79dd-4823-a42d-bb08417a0f90): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.536988 4780 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:89f9e06c633ae852be8d3e3ca581def0a6e9a5b38c0d519f656976c7414b6b97,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:56f155abc1b8734e4a79c7306ba38caf8d2881625f37d2f9c5a5763fa4db7e02,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:29c8cd4f2d853f512e2ecd44f522f28c3aac046a72733365aa5e91667041d62e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:ed896681f0d9720f56bbcb0b7a4f3626ed397e89af919604ca68b42b7b598859,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:712e1c932a90ef5e3c3ee5d5aea591a377da8c4af604ebd8ec399869a61dfbef,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:10fd8489a5bf6f1d781e9226de68356132db78b62269e69d632748cb08fae725,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:73fd28af83ea96cc920d26dba6105ee59f0824234527949884e6ca55b71d7533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:8b3a90516ba0695cf3198a7b101da770c30c8100cb79f8088b5729e6a50ddd6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:6d42bcf65422d2de9cd807feb3e8b005de10084b4b8eb340c8a9045644ae7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:32a25ac44706b73bff04a89514177b1efd675f0442b295e225f0020555ca6350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:b19043eac7c653e00da8da9418ae378fdd29698adb1adb4bf5ae7cfc03ba5538,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:c486e00b36ea7698d6a4cd9048a759bad5a8286e4949bbd1f82c3ddb70600b9b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_
IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ef2727f0300fbf3bf15d8ddc409d0fd63e4aac9dd64c86459bd6ff64fc6b9534,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:329aac65ba00c3cf43bb1d5fac8818752f01de90b47719e2a84db4e2fe083292,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:6ce73885ac1ee7c69468efc448eff5deae46502812c5e3d099f771e1cc03345f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:282cc0fcdbb8a688dd62a2499480aae4a36b620f2160d51e6c8269e6cc32d5fc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:d98c0c9d3bdd84daf4b98d45b8bbe2e67a633491897dda7167664a5fa1f0f26e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:4ad1d36fe1c8992e43910fc2d566b991fd73f9b82b1ab860c66858448ff82c00,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:92789eab1b8a91807a5e898cb63478d125ae539eafe63c96049100c6ddeadb04,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:ee9832268e0df5d62c50c5ce171e9ef72a035aa74c718cfbf482e34426d8d15e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:07b4f96f24f32224c13613f85173f9fcc3092b8797ffa47519403d124bfe4c15,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:3a873c95bcb7ae8bd24ff1eb5fe89ac5272a41a3345a7b41d55419b5d66b70e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:388dbae2f1aae2720e919cc24d10cd577b73b4e4ef7abdc34287bcb8d27ff98f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:d4c1b2496868da3dcca9f4bda0834fcc58d23c21d8ce3c42a68205d02039c487,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:c4414cc2680fb1bacbf99261f759f4ef7401fb2e4953140270bffdab8e002f22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:b9b950a656f1456b3143872c492b0987bf4a9e23bc7c59d843cf50099667b368,ValueFrom:nil,},En
vVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:afd5d6822b86ea0930b2011fede834bb24495995d7baac03363ab61d89f07a22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:665d7a25dfc959ec5448d5ba6b430792ebde1be1580ea6809e9b3b4f94184b3f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:499c6d82390ee2dbb91628d2e42671406372fb603d697685a04145cf6dd8d0ab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:da2736bc98bfe340e86234523d4c00220f6f79add271900981cf4ad9f4c5ee51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:4df8dad8a5fb4805a0424cbc0b8df666b9a06b76c64f26e186f3b9e8efe6cd95,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:65c16453b5b7bb113646ffce0be26138e89eecbf6dd1582cdfe76af7f5dc62cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ce968dce2209ec5114772b4b73ed16c0a25988637372f2afbfac080cc6f1e378,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:b7823eaacf55280cdf3f1bede4f40bf49fdbf9ba9f3f5ba64b0abedede601c8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:605206d967ffaa20156eb07a645654cd3e0f880bb0eefbb2b5e1e749b169f148,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:9470db6caf5102cf37ddb1f137f17b05ef7119f174f4189beb4839ef7f65730c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:34e84da4ae7e5d65931cbefcda84fd8fdc93271ec466adf1a9040b67a3af176a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:b301b17c31e47733a8a232773427ce3cb50433a3aa09d4a5bd998b1aeb5e5530,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:d642c35c0f9d3acf31987c028f1d4d4fdf7b49e1d6cbcd73268c12b3d6e14b86,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:922eb0799ab36a91aa95abe52565dc60db807457dbf8c651b30e06b9e8aebcd4,Va
lueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:cd01e9605ab513458a6813e38d37fbfde1a91388cc5c00962203dbcbdc285e79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:dd35c22b17730cbca8547ea98459f182939462c8dc3465d21335a377018937de,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:0e0e2e48a41d5417f1d6a4407e63d443611b7eacd66e27f561c9eedf3e5a66c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:735bd24219fdb5f21c31313a5bc685364f45c004fb5e8af634984c147060d4e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:35b5554efae34f2c25a2d274c78bdaecf3d4ce949fa61c692835ee54cdfc6d74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:01b93ab0d87482b9a1fd46706771974743dea1ca74f5fcc3de4a560f7cfc033b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:87471fbe3ba77b7115096f4fef8f5a9e1468cbd5bf6060c09785a60f9107a717,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:947dcc46173064939cba252d5db34eb6ddd05eb0af7afd762beebe77e9a72c6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:8498ed720d02ce4e7045f7eb0051b138274cddba9b1e443d11e413da3474d3a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:2cb054830655a6af5fc6848360618676d24fd9cf15078c0b9855e09d05733eec,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:0f5f8f560cd3b4951f7e8e67ef570575435b4c6915658cbb66f32a201776078b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:7055e8d7b7d72ce697c6077be14c525c019d186002f04765b90a14c82e01cc7c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:d2cd7a21461b4b569d93a63d57761f437cf6bd0847d69a3a65f64d400c7cca6d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:432c0c6f36a5e4e4db394771f7dc72f3bf9e5060dc4220f781d3c5050cc17f0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:3ff379a74cc15352bfa25605dbb1a5f4250620e8364bf87ed2f3d5c17e6a8b26,ValueFrom:nil,},EnvVar{N
ame:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:c67a7bba2fc9351c302369b590473a737bab20d0982d227756fe1fa0bc1c8773,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:50c613d159667a26ba4bfb7aebf157b8db8919c815a866438b1d2700231a508e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:f3d3d7a7c83926a09714199406bfe8070e6be5055cbfbf00aa37f47e1e5e9bc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e9b3260907b0e417bb779a7d513a2639734cbbf792e77c61e05e760d06978f4a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:1aa6a76e67f2d91ee45472741238b5d4ab53f9bcb94db678c7ae92e1af28899d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:80b8547cf5821a4eb5461d1ac14edbc700ef03926268af960bf511647de027af,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content@sha256:7086442096db5ceb68e22bcce00688072957fdad07d00d8f18eb0506ad958923,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:bd08ffdb4dcfd436200d846d15b2bdcc14122fa43adfea4c0980a087a18f9e3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:2d1e733d24df6ca02636374147f801a0ec1509f8db2f9ad8c739b3f2341815fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:c08ba2a0df4cc18e615b25c329e9c74153709b435c032c38502ec78ba297c5fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:b6cdafc7722def5b63ef4f00251e10aca93ef82628b21e88925c3d4b49277316,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:0a0bbe43e3c266dfeb40a09036f76393dc70377b636724c130a29c434f6d6c82,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:7387b628d7cfb3ff349e0df6f11f41ae7fdb0e2d55844944896af02a81ac7cf7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:9a3671dee1752ebe3639a0b16de95d29e779f1629d563e0585d65b9792542fc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e7
2bbc42e80f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:b2782fe02b1438d68308a5847b0628f0971b5bb8bb0a4d20fe15176fa75bd33f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:7118cc3a695fead2a8bab14c8ace018ed7a5ba23ef347bf4ead44219e8467866,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:793a836e17b07b0e0a4e8d3177fd04724e1e058fca275ef434abe60a2e444a79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:713d74dc81859344bdcae68a9f7a954146c3e68cfa819518a58cce9e896298c8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:e39be536015777a1b0df8ac863f354046b2b15fee8482abd37d2fa59d8074208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:28e209c66bc86354495ac7793f2e66db0e8540485590742ab1b53a7cf24cb4fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:d117753b6cff563084bf771173ea89a2ce00854efdc45447667e5d230c60c363,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:f1aac0a57d83b085c37cf75ce0a56f85b68353b1a88740b64a5858bc93dba36b,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5b6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh_openstack-operators(dcc90bb2-08d8-448b-85bb-955bfc3a7371): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.539032 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28049dad_f386_4b21_b525_63fd463b8c37.slice/crio-bd37c0c94cd14c485c191d002b8ab204fdd879ad08b656b1fd1009fd20b19265 WatchSource:0}: Error finding container bd37c0c94cd14c485c191d002b8ab204fdd879ad08b656b1fd1009fd20b19265: Status 404 returned error can't find the container with id bd37c0c94cd14c485c191d002b8ab204fdd879ad08b656b1fd1009fd20b19265 Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.540243 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fceb492_ba01_4e2f_b59b_6557da4e851a.slice/crio-b9293de899c36d8490265fdd34e3ee1fc32c7061eabf80ecc87226b448f0e6ea WatchSource:0}: Error finding container b9293de899c36d8490265fdd34e3ee1fc32c7061eabf80ecc87226b448f0e6ea: Status 404 returned error can't find the container with id b9293de899c36d8490265fdd34e3ee1fc32c7061eabf80ecc87226b448f0e6ea Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.540553 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18dfa3ae-5e34-436b-87b9-f215e898567c-cert\") pod \"openstack-operator-controller-manager-7b7bb8bd67-s9lgn\" (UID: \"18dfa3ae-5e34-436b-87b9-f215e898567c\") " pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.545638 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4nrzj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75756dd4d9-ksn7s_openstack-operators(8fceb492-ba01-4e2f-b59b-6557da4e851a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.546035 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nqcrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf5bb885-46wcs_openstack-operators(28049dad-f386-4b21-b525-63fd463b8c37): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: W0929 18:58:49.546205 4780 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd74a7d5_36fa_4c53_b9a6_9f9a733791d5.slice/crio-0de8c4e6d8e70600c62106bf7b4e0e7dcc55d310b732699a39c37144378c961c WatchSource:0}: Error finding container 0de8c4e6d8e70600c62106bf7b4e0e7dcc55d310b732699a39c37144378c961c: Status 404 returned error can't find the container with id 0de8c4e6d8e70600c62106bf7b4e0e7dcc55d310b732699a39c37144378c961c Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.549575 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sssm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d9c7d9477-jzhjc_openstack-operators(cd74a7d5-36fa-4c53-b9a6-9f9a733791d5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 29 18:58:49 crc kubenswrapper[4780]: I0929 18:58:49.588364 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn"
Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.770520 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" podUID="dcc90bb2-08d8-448b-85bb-955bfc3a7371"
Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.816628 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" podUID="e6865432-79dd-4823-a42d-bb08417a0f90"
Sep 29 18:58:49 crc kubenswrapper[4780]: E0929 18:58:49.817150 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" podUID="cec435bb-5818-41aa-8177-dfdddc267c00"
Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.019906 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" podUID="8fceb492-ba01-4e2f-b59b-6557da4e851a"
Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.086156 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" podUID="cd74a7d5-36fa-4c53-b9a6-9f9a733791d5"
Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.106768 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" podUID="28049dad-f386-4b21-b525-63fd463b8c37"
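
All six StartContainer failures above report the same ErrImagePull reason, pull QPS exceeded. That error does not come from the registry: it is the kubelet's own client-side rate limit on image pulls, a token bucket governed by the KubeletConfiguration fields registryPullQPS and registryBurst (commonly cited defaults: 5 per second with a burst of 10; treat those as assumptions here). Starting roughly twenty operator deployments at once drains the bucket, and the pulls that find it empty fail immediately rather than queue. A small stdlib Go sketch of that admission check, with the default constants assumed:

package main

import (
	"fmt"
	"time"
)

// tokenBucket is a minimal sketch of the limiter behind "pull QPS exceeded":
// it refills at qps tokens per second up to a capacity of burst, and each
// image pull must take one token or fail immediately.
type tokenBucket struct {
	qps    float64
	burst  float64
	tokens float64
	last   time.Time
}

func newTokenBucket(qps, burst float64) *tokenBucket {
	return &tokenBucket{qps: qps, burst: burst, tokens: burst, last: time.Now()}
}

// tryAccept consumes one token if available; it never blocks, matching the
// fail-fast ErrImagePull behaviour seen in the records above.
func (b *tokenBucket) tryAccept() bool {
	now := time.Now()
	b.tokens += now.Sub(b.last).Seconds() * b.qps // refill for elapsed time
	if b.tokens > b.burst {
		b.tokens = b.burst
	}
	b.last = now
	if b.tokens >= 1 {
		b.tokens--
		return true
	}
	return false
}

func main() {
	limiter := newTokenBucket(5, 10) // assumed registryPullQPS / registryBurst
	for i := 1; i <= 22; i++ {       // ~22 operator images requested at once
		if limiter.tryAccept() {
			fmt.Printf("pull %2d: started\n", i)
		} else {
			fmt.Printf("pull %2d: ErrImagePull: pull QPS exceeded\n", i)
		}
	}
}

The fail-fast choice matters operationally: each rejected pod is requeued by the sync loop instead of tying up a worker, which is why the same pods reappear within a second, now in ImagePullBackOff, in the records that follow.
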
event={"ID":"40a3e409-3dbc-4936-819f-c64fe007d584","Type":"ContainerStarted","Data":"7266b7f354b32cec5de6b8aa07a90eafc3c828063fac97bbd0c8a03d02fc989c"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.296724 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" event={"ID":"28049dad-f386-4b21-b525-63fd463b8c37","Type":"ContainerStarted","Data":"dc9ed8e230b75ea6e0b2cccc4786424c59182ab364f7707f350dda2e7cc249a9"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.296758 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" event={"ID":"28049dad-f386-4b21-b525-63fd463b8c37","Type":"ContainerStarted","Data":"bd37c0c94cd14c485c191d002b8ab204fdd879ad08b656b1fd1009fd20b19265"} Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.299386 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" podUID="28049dad-f386-4b21-b525-63fd463b8c37" Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.304352 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" event={"ID":"8fceb492-ba01-4e2f-b59b-6557da4e851a","Type":"ContainerStarted","Data":"defaae1ab58ac84351a496b154207ffce9237515b594d3c2674f1b43a85cfa02"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.304462 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" event={"ID":"8fceb492-ba01-4e2f-b59b-6557da4e851a","Type":"ContainerStarted","Data":"b9293de899c36d8490265fdd34e3ee1fc32c7061eabf80ecc87226b448f0e6ea"} Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.306383 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" podUID="8fceb492-ba01-4e2f-b59b-6557da4e851a" Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.326121 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" event={"ID":"dcc90bb2-08d8-448b-85bb-955bfc3a7371","Type":"ContainerStarted","Data":"5ec1a28c0cea3f7b10580f10417fa7237ea744b5b4a3cf04ceaf12f70dffb5f9"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.326177 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" event={"ID":"dcc90bb2-08d8-448b-85bb-955bfc3a7371","Type":"ContainerStarted","Data":"50d7a4928d4fe04e3e0a15de47ed8473e408c705b1f1521046c7a98c98cd0cb2"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.334957 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" 
event={"ID":"e6865432-79dd-4823-a42d-bb08417a0f90","Type":"ContainerStarted","Data":"ddddeaec2a115cd1343eaa5e2247d01f770e87903d9758213d98d6f70bf42a50"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.335014 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" event={"ID":"e6865432-79dd-4823-a42d-bb08417a0f90","Type":"ContainerStarted","Data":"e291b4b0e65a8ed9c25f5aa333b7247aae2b83831687718cf8d20c98e9e43f78"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.371653 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" event={"ID":"541588f6-d71c-42ca-b4eb-515f5409f2d1","Type":"ContainerStarted","Data":"54ac2101e61148cd04c186b50d8088bbae5775dacfc44f5f7355160145d6e8a4"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.375723 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" event={"ID":"02f6d355-f384-4b36-b518-55ad38e66215","Type":"ContainerStarted","Data":"65e4f55e1c331cee433c8c36a141af71849012994c294cd922cfb252e25db835"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.379194 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" event={"ID":"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5","Type":"ContainerStarted","Data":"902b456550238735b8c15aeded6b53bd20fc224c8e69b77dc166ef800b52bafe"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.379239 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" event={"ID":"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5","Type":"ContainerStarted","Data":"0de8c4e6d8e70600c62106bf7b4e0e7dcc55d310b732699a39c37144378c961c"} Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.388485 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" podUID="dcc90bb2-08d8-448b-85bb-955bfc3a7371" Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.389265 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" event={"ID":"6865eded-097c-49c7-a54d-cda27a2adc65","Type":"ContainerStarted","Data":"94d719ed7d28cd4162d7aa974e0f7ce142d8e85300e9c13af881a9a25b854e63"} Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.389942 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" podUID="e6865432-79dd-4823-a42d-bb08417a0f90" Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.394826 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" podUID="cd74a7d5-36fa-4c53-b9a6-9f9a733791d5" Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.400505 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" event={"ID":"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a","Type":"ContainerStarted","Data":"eb9bd8491bb29d1a92cd033a348251d5a7c0ccc643be98440bda1e1d16df3b8a"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.420844 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" event={"ID":"cec435bb-5818-41aa-8177-dfdddc267c00","Type":"ContainerStarted","Data":"a4b49321934803f3e9b802d9889cda47defe394a3d4781e648722f6c3c839720"} Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.420929 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" event={"ID":"cec435bb-5818-41aa-8177-dfdddc267c00","Type":"ContainerStarted","Data":"80ee4fad13d8b7ab6e73d3c652099df910d442ef9b9e20c29e278cb0acdd9fa8"} Sep 29 18:58:50 crc kubenswrapper[4780]: E0929 18:58:50.427171 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" podUID="cec435bb-5818-41aa-8177-dfdddc267c00" Sep 29 18:58:50 crc kubenswrapper[4780]: I0929 18:58:50.440467 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" event={"ID":"18dfa3ae-5e34-436b-87b9-f215e898567c","Type":"ContainerStarted","Data":"fa74963151efcaa463c382c090663223f4555b42ab15bde829010c9e248df326"} Sep 29 18:58:51 crc kubenswrapper[4780]: I0929 18:58:51.481920 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" event={"ID":"18dfa3ae-5e34-436b-87b9-f215e898567c","Type":"ContainerStarted","Data":"02acb6e33460f17456a22a4bdffb5be7dce61754fc6994061f0ef8332060da78"} Sep 29 18:58:51 crc kubenswrapper[4780]: I0929 18:58:51.482419 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" event={"ID":"18dfa3ae-5e34-436b-87b9-f215e898567c","Type":"ContainerStarted","Data":"fe00365b66163beed8c0620b02a3271cd2964672ef580332db655a85e272c666"} Sep 29 18:58:51 crc kubenswrapper[4780]: I0929 18:58:51.482440 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.485965 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" podUID="cd74a7d5-36fa-4c53-b9a6-9f9a733791d5" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.486417 
4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" podUID="cec435bb-5818-41aa-8177-dfdddc267c00" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.486472 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" podUID="8fceb492-ba01-4e2f-b59b-6557da4e851a" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.486509 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" podUID="28049dad-f386-4b21-b525-63fd463b8c37" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.489648 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" podUID="e6865432-79dd-4823-a42d-bb08417a0f90" Sep 29 18:58:51 crc kubenswrapper[4780]: E0929 18:58:51.489710 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" podUID="dcc90bb2-08d8-448b-85bb-955bfc3a7371" Sep 29 18:58:51 crc kubenswrapper[4780]: I0929 18:58:51.628655 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" podStartSLOduration=4.628624842 podStartE2EDuration="4.628624842s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 18:58:51.617792342 +0000 UTC m=+931.566090386" watchObservedRunningTime="2025-09-29 18:58:51.628624842 +0000 UTC m=+931.576922886" Sep 29 18:58:59 crc kubenswrapper[4780]: I0929 18:58:59.595297 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7b7bb8bd67-s9lgn" Sep 29 18:59:01 crc kubenswrapper[4780]: I0929 18:59:01.575493 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" event={"ID":"8add36ee-ae48-47aa-a1b8-39e26a2b61c4","Type":"ContainerStarted","Data":"dcb4eb30c649b70635ecbc8e09a7dd47b6eb3bec6002f35ae6a304cdb5dd0a26"} Sep 29 18:59:01 crc kubenswrapper[4780]: I0929 
18:59:01.587490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" event={"ID":"85948289-f8ff-4ccb-8322-17c68d0ca529","Type":"ContainerStarted","Data":"0da29958f3d7647d28ba56d5ab3578d5b7a2237a5724b3f036dfc4d7537e9c39"} Sep 29 18:59:01 crc kubenswrapper[4780]: I0929 18:59:01.596400 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" event={"ID":"02f6d355-f384-4b36-b518-55ad38e66215","Type":"ContainerStarted","Data":"68d86290dcaa6c5e05bc0c41e416a18e57a181e232e5b9d21f34242efbe2b9ed"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.639124 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" event={"ID":"1b274b66-59c6-49e6-8469-dfaa9d5a85cc","Type":"ContainerStarted","Data":"cf7f457f9da3ed1b447f6a10259df26f21fd6708050db83df2a414f29a048d1e"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.658421 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" event={"ID":"eabb644f-cfed-402e-8e6c-b98dc6ec30ef","Type":"ContainerStarted","Data":"2d376a193189d0f81d4740df249996c3b2b369465e6a903dcce03fee8891bfe4"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.671202 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" event={"ID":"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a","Type":"ContainerStarted","Data":"d4f014070b1f8aee86f69a7cf56a9b37382192633a25c0ff3a6edd0c374922e6"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.680638 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" event={"ID":"85948289-f8ff-4ccb-8322-17c68d0ca529","Type":"ContainerStarted","Data":"7bf0609311602ec67d53400874bfda1471b1bdf5d1b8a7f08ff97adb3c24a68f"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.687217 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.696711 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" event={"ID":"53223fa3-3901-4f53-9c6b-18e07485a7ad","Type":"ContainerStarted","Data":"7bdc2d15bdc47dee31d0499b01b01de26c14c55f59ac00c58a454dcbdad074cf"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.698528 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" event={"ID":"de762ea1-08cb-48cd-8e29-2d7523a63ef8","Type":"ContainerStarted","Data":"f931d2af052e1cac89ec3c2cc82e6f787f7b15456c28866b96edf7603256a85e"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.702033 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" event={"ID":"afe8c052-ff7e-4892-81fa-8045f69346eb","Type":"ContainerStarted","Data":"cf3e8957eaf8111bdbe0618ec4eb83b2803bbfd9f8e771a5f9990f905b1e93a4"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.704933 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" 
event={"ID":"f488a5b4-5b60-4e98-9095-5c6b3e7d580b","Type":"ContainerStarted","Data":"9bc33ed1c0890aa73d222a258a239cb39329b0c3e1cf796d4122d2f8753e3dba"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.719717 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" podStartSLOduration=4.078029326 podStartE2EDuration="16.719695646s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.498710227 +0000 UTC m=+928.447008261" lastFinishedPulling="2025-09-29 18:59:01.140376537 +0000 UTC m=+941.088674581" observedRunningTime="2025-09-29 18:59:02.713610732 +0000 UTC m=+942.661908776" watchObservedRunningTime="2025-09-29 18:59:02.719695646 +0000 UTC m=+942.667993690" Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.720377 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" event={"ID":"541588f6-d71c-42ca-b4eb-515f5409f2d1","Type":"ContainerStarted","Data":"0a0af2d34dc540403e938b5f8ba14efbcde0a39a5cb1caceedc5fe5419fd6d6b"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.743372 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" event={"ID":"6865eded-097c-49c7-a54d-cda27a2adc65","Type":"ContainerStarted","Data":"9e5305fc390630559f74f0f0c45f3e7b49a3f88d9e2233f60ccb093b222c3f77"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.799509 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn" event={"ID":"e6b57f8b-1be2-48b1-be60-30a3583f6052","Type":"ContainerStarted","Data":"3681548bd40393d98dcd4a4d040095fe8ae0dc785b49d115e7561c2c127e21b5"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.799743 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" event={"ID":"23713df8-910e-453e-a639-cdfc43473071","Type":"ContainerStarted","Data":"352e2b34c51153010bd24b7099cfe56875916507718509a7815b5a9981fb8633"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.808040 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-j92fn" podStartSLOduration=4.12549562 podStartE2EDuration="15.808012813s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.506113885 +0000 UTC m=+929.454411929" lastFinishedPulling="2025-09-29 18:59:01.188631078 +0000 UTC m=+941.136929122" observedRunningTime="2025-09-29 18:59:02.800323264 +0000 UTC m=+942.748621298" watchObservedRunningTime="2025-09-29 18:59:02.808012813 +0000 UTC m=+942.756310857" Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.819310 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" event={"ID":"0dffef5d-ec0f-4e39-a948-c670be2a8521","Type":"ContainerStarted","Data":"c7196547a8021cca03bc14c3307071458c854e4f457e525078ff4ed9658a988a"} Sep 29 18:59:02 crc kubenswrapper[4780]: I0929 18:59:02.831537 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" event={"ID":"40a3e409-3dbc-4936-819f-c64fe007d584","Type":"ContainerStarted","Data":"ae20fb1c2eaa0061f182cbe35768aa80e84a738024ef352c820997fcff2588e7"} 
Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.223023 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.223107 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.223166 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.223966 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.224085 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20" gracePeriod=600 Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.845495 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" event={"ID":"afe8c052-ff7e-4892-81fa-8045f69346eb","Type":"ContainerStarted","Data":"79e17250f12d1e3b28aaad067e4bba9d8c8546b93959c19965d82bf024122685"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.846075 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.860977 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" event={"ID":"53223fa3-3901-4f53-9c6b-18e07485a7ad","Type":"ContainerStarted","Data":"1b3dfce7c53e774186ca2a9d7c0c330ee5393878cd4f2daec790d0989f2ecf5a"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.861136 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.873770 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" podStartSLOduration=5.143066462 podStartE2EDuration="17.87375417s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.40970228 +0000 UTC m=+928.358000324" lastFinishedPulling="2025-09-29 18:59:01.140389988 +0000 UTC m=+941.088688032" observedRunningTime="2025-09-29 18:59:03.87338782 +0000 UTC m=+943.821685864" watchObservedRunningTime="2025-09-29 18:59:03.87375417 +0000 UTC 
m=+943.822052214" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.879999 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" event={"ID":"f488a5b4-5b60-4e98-9095-5c6b3e7d580b","Type":"ContainerStarted","Data":"c9ca21c6ed1fe92608dc68a5aafce68541ab623197b903e7d3370a72e205af56"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.880766 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.883764 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" event={"ID":"1b274b66-59c6-49e6-8469-dfaa9d5a85cc","Type":"ContainerStarted","Data":"a5f5caf1d3bf0185b60844316c3d3b711fed4bdeae231ea91d0f4bc38773d02b"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.884424 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.886692 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" event={"ID":"40a3e409-3dbc-4936-819f-c64fe007d584","Type":"ContainerStarted","Data":"44ae3262d9bef54e79817773f6b1d741cde90434ef19d9e7415f0bd49c982fba"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.886870 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.889307 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" event={"ID":"de762ea1-08cb-48cd-8e29-2d7523a63ef8","Type":"ContainerStarted","Data":"74c9c183e67bcda860394f4b5c8c666bff4a7ca5bf36a5eff955c9d3da6af9a0"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.889392 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.891270 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" event={"ID":"ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a","Type":"ContainerStarted","Data":"4ff7105bab9b2ecb268623b179508a49412df4f2a83b5025ec57ff8ecd0b61af"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.891807 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.898748 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" event={"ID":"6865eded-097c-49c7-a54d-cda27a2adc65","Type":"ContainerStarted","Data":"558c8405d858e5fd3703b866f5f9cd9640fa4ffb69259c353ce3d4f9ec948910"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.899265 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.899395 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" podStartSLOduration=5.503631113 podStartE2EDuration="17.899372123s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.745887673 +0000 UTC m=+928.694185727" lastFinishedPulling="2025-09-29 18:59:01.141628693 +0000 UTC m=+941.089926737" observedRunningTime="2025-09-29 18:59:03.8957733 +0000 UTC m=+943.844071344" watchObservedRunningTime="2025-09-29 18:59:03.899372123 +0000 UTC m=+943.847670167" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.901511 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" event={"ID":"0dffef5d-ec0f-4e39-a948-c670be2a8521","Type":"ContainerStarted","Data":"af5ecc1604f4a38d9cd39488e80fd9de1cda8b530ec28128a2f1512de9d1881c"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.901717 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.909080 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" event={"ID":"02f6d355-f384-4b36-b518-55ad38e66215","Type":"ContainerStarted","Data":"3e84299de05f8668c6632f59631edd5d4349cad813cdc1481ec511ce9afc0057"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.909245 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.913293 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" event={"ID":"eabb644f-cfed-402e-8e6c-b98dc6ec30ef","Type":"ContainerStarted","Data":"ca8b0be7a323c451be305bb589972e7b5c49a62b3592410251088bac047cdc0f"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.913536 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.920034 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" podStartSLOduration=5.230540642 podStartE2EDuration="16.920006824s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.452309015 +0000 UTC m=+929.400607069" lastFinishedPulling="2025-09-29 18:59:01.141775217 +0000 UTC m=+941.090073251" observedRunningTime="2025-09-29 18:59:03.917368689 +0000 UTC m=+943.865666733" watchObservedRunningTime="2025-09-29 18:59:03.920006824 +0000 UTC m=+943.868304868" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.947473 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" podStartSLOduration=5.749541853 podStartE2EDuration="17.94744756s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.957432238 +0000 UTC m=+928.905730282" lastFinishedPulling="2025-09-29 18:59:01.155337945 +0000 UTC m=+941.103635989" observedRunningTime="2025-09-29 18:59:03.943910928 +0000 UTC m=+943.892208982" watchObservedRunningTime="2025-09-29 18:59:03.94744756 +0000 UTC m=+943.895745604" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.954386 
4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20" exitCode=0 Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.963162 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.963225 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.963256 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.963273 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" event={"ID":"23713df8-910e-453e-a639-cdfc43473071","Type":"ContainerStarted","Data":"27b984980f6621f55603440afcfcca95c6a890386c96a39306a2fda0d41fd795"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.963294 4780 scope.go:117] "RemoveContainer" containerID="f98e9b0b044c5602c6337b54b65a812f8f898d93f6aec3d809843fc6e333379d" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.965735 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" event={"ID":"541588f6-d71c-42ca-b4eb-515f5409f2d1","Type":"ContainerStarted","Data":"4d7a9c6900fd2ea258ddd51166c0235bf5c94500af19b80d0bcd7d26db8a43b6"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.966575 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.974883 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" podStartSLOduration=6.329860884 podStartE2EDuration="17.974855674s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.496668484 +0000 UTC m=+929.444966528" lastFinishedPulling="2025-09-29 18:59:01.141663274 +0000 UTC m=+941.089961318" observedRunningTime="2025-09-29 18:59:03.970830889 +0000 UTC m=+943.919128943" watchObservedRunningTime="2025-09-29 18:59:03.974855674 +0000 UTC m=+943.923153728" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.981753 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" event={"ID":"8add36ee-ae48-47aa-a1b8-39e26a2b61c4","Type":"ContainerStarted","Data":"3299fca0e5fb6e26935d6809a14b88d5f55900a1391b34c0ffc58fff0627adc4"} Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.981802 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:59:03 crc kubenswrapper[4780]: I0929 18:59:03.996432 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" podStartSLOduration=5.771687726 podStartE2EDuration="17.996405491s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.916940569 +0000 UTC m=+928.865238613" lastFinishedPulling="2025-09-29 18:59:01.141658334 +0000 UTC m=+941.089956378" observedRunningTime="2025-09-29 18:59:03.992237632 +0000 UTC m=+943.940535676" watchObservedRunningTime="2025-09-29 18:59:03.996405491 +0000 UTC m=+943.944703535" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.036370 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" podStartSLOduration=5.109453761 podStartE2EDuration="18.036343964s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.21474864 +0000 UTC m=+928.163046684" lastFinishedPulling="2025-09-29 18:59:01.141638843 +0000 UTC m=+941.089936887" observedRunningTime="2025-09-29 18:59:04.02990456 +0000 UTC m=+943.978202604" watchObservedRunningTime="2025-09-29 18:59:04.036343964 +0000 UTC m=+943.984642018" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.059598 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" podStartSLOduration=5.049716235 podStartE2EDuration="17.059573409s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.13040263 +0000 UTC m=+929.078700674" lastFinishedPulling="2025-09-29 18:59:01.140259804 +0000 UTC m=+941.088557848" observedRunningTime="2025-09-29 18:59:04.052490357 +0000 UTC m=+944.000788401" watchObservedRunningTime="2025-09-29 18:59:04.059573409 +0000 UTC m=+944.007871453" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.094958 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" podStartSLOduration=5.900884895 podStartE2EDuration="18.094939742s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.941172013 +0000 UTC m=+928.889470057" lastFinishedPulling="2025-09-29 18:59:01.13522686 +0000 UTC m=+941.083524904" observedRunningTime="2025-09-29 18:59:04.077015089 +0000 UTC m=+944.025313143" watchObservedRunningTime="2025-09-29 18:59:04.094939742 +0000 UTC m=+944.043237796" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.112800 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" podStartSLOduration=5.468934255 podStartE2EDuration="17.112777512s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.496286013 +0000 UTC m=+929.444584057" lastFinishedPulling="2025-09-29 18:59:01.14012927 +0000 UTC m=+941.088427314" observedRunningTime="2025-09-29 18:59:04.112639858 +0000 UTC m=+944.060937902" watchObservedRunningTime="2025-09-29 18:59:04.112777512 +0000 UTC m=+944.061075546" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.116069 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" podStartSLOduration=5.3969334 podStartE2EDuration="18.116057246s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.421235311 +0000 UTC m=+928.369533355" 
lastFinishedPulling="2025-09-29 18:59:01.140359157 +0000 UTC m=+941.088657201" observedRunningTime="2025-09-29 18:59:04.09383952 +0000 UTC m=+944.042137564" watchObservedRunningTime="2025-09-29 18:59:04.116057246 +0000 UTC m=+944.064355290" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.143552 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" podStartSLOduration=5.188458518 podStartE2EDuration="17.143533013s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.180073882 +0000 UTC m=+929.128371926" lastFinishedPulling="2025-09-29 18:59:01.135148377 +0000 UTC m=+941.083446421" observedRunningTime="2025-09-29 18:59:04.126288429 +0000 UTC m=+944.074586473" watchObservedRunningTime="2025-09-29 18:59:04.143533013 +0000 UTC m=+944.091831057" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.169399 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" podStartSLOduration=4.930934025 podStartE2EDuration="17.169374592s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:48.93232711 +0000 UTC m=+928.880625154" lastFinishedPulling="2025-09-29 18:59:01.170767677 +0000 UTC m=+941.119065721" observedRunningTime="2025-09-29 18:59:04.167277902 +0000 UTC m=+944.115575966" watchObservedRunningTime="2025-09-29 18:59:04.169374592 +0000 UTC m=+944.117672636" Sep 29 18:59:04 crc kubenswrapper[4780]: I0929 18:59:04.186815 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" podStartSLOduration=5.56794592 podStartE2EDuration="17.186792931s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.52306229 +0000 UTC m=+929.471360334" lastFinishedPulling="2025-09-29 18:59:01.141909301 +0000 UTC m=+941.090207345" observedRunningTime="2025-09-29 18:59:04.184588158 +0000 UTC m=+944.132886202" watchObservedRunningTime="2025-09-29 18:59:04.186792931 +0000 UTC m=+944.135090975" Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.015867 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" event={"ID":"e6865432-79dd-4823-a42d-bb08417a0f90","Type":"ContainerStarted","Data":"2b354b509c3f83d96be5f66a7754e6a41581ecff5162d064049e4cb5093d8fda"} Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.017211 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.018862 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" event={"ID":"8fceb492-ba01-4e2f-b59b-6557da4e851a","Type":"ContainerStarted","Data":"e425d189273452ec5de1f2b7d647356890ef0fdedecb25b4da7b5451714d4630"} Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.023023 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6b96467f46-lfnp5" Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.023521 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-d8s8d" Sep 
29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.023731 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-657c6b68c7-625qh" Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.046778 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" podStartSLOduration=2.939756148 podStartE2EDuration="19.046750213s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.532243193 +0000 UTC m=+929.480541237" lastFinishedPulling="2025-09-29 18:59:05.639237228 +0000 UTC m=+945.587535302" observedRunningTime="2025-09-29 18:59:06.04420097 +0000 UTC m=+945.992499024" watchObservedRunningTime="2025-09-29 18:59:06.046750213 +0000 UTC m=+945.995048257" Sep 29 18:59:06 crc kubenswrapper[4780]: I0929 18:59:06.118245 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" podStartSLOduration=3.817410671 podStartE2EDuration="19.118219219s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.545478401 +0000 UTC m=+929.493776445" lastFinishedPulling="2025-09-29 18:59:04.846286949 +0000 UTC m=+944.794584993" observedRunningTime="2025-09-29 18:59:06.111073394 +0000 UTC m=+946.059371438" watchObservedRunningTime="2025-09-29 18:59:06.118219219 +0000 UTC m=+946.066517263" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.035254 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" event={"ID":"cec435bb-5818-41aa-8177-dfdddc267c00","Type":"ContainerStarted","Data":"fe08760be847ab5f109da3072919d6811dc02115eeee39be8cc2c2c6c27806b3"} Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.036228 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.039366 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-qm8gn" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.064938 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-n9smm" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.069034 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" podStartSLOduration=2.825315362 podStartE2EDuration="20.068990314s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.528686741 +0000 UTC m=+929.476984775" lastFinishedPulling="2025-09-29 18:59:06.772361683 +0000 UTC m=+946.720659727" observedRunningTime="2025-09-29 18:59:07.05593185 +0000 UTC m=+947.004229894" watchObservedRunningTime="2025-09-29 18:59:07.068990314 +0000 UTC m=+947.017288358" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.094444 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-xhck5" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.236191 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-r4g5l" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.269526 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-f2xqf" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.374517 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-7ktf6" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.501980 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-lssxn" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.502081 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f589bc7f7-zfhtk" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.524251 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-slhwp" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.633934 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79f9fc9fd8-7mc4n" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.733890 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6fb7d6b8bf-dfxwb" Sep 29 18:59:07 crc kubenswrapper[4780]: I0929 18:59:07.934950 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb97fcf96-szxrn" Sep 29 18:59:08 crc kubenswrapper[4780]: I0929 18:59:08.266897 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.059464 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" event={"ID":"cd74a7d5-36fa-4c53-b9a6-9f9a733791d5","Type":"ContainerStarted","Data":"408fc0fbf2c785b2a861d15a077a979c7dd467c6ee384d7167910ac93a0d0ebb"} Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.060266 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.062808 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" event={"ID":"28049dad-f386-4b21-b525-63fd463b8c37","Type":"ContainerStarted","Data":"fed5f019bd56bfe861069a7d20ed5296ef77a910ab1928fc7f3507f07fcddc30"} Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.063001 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.064941 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" event={"ID":"dcc90bb2-08d8-448b-85bb-955bfc3a7371","Type":"ContainerStarted","Data":"3fab6bc4e7ba640cab091cb077ffbb02456ae8f0fd41b0a64f402864c7dc94a4"} Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.065276 4780 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.078542 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" podStartSLOduration=4.606821752 podStartE2EDuration="24.078520522s" podCreationTimestamp="2025-09-29 18:58:46 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.549406784 +0000 UTC m=+929.497704828" lastFinishedPulling="2025-09-29 18:59:09.021105554 +0000 UTC m=+948.969403598" observedRunningTime="2025-09-29 18:59:10.078264594 +0000 UTC m=+950.026562658" watchObservedRunningTime="2025-09-29 18:59:10.078520522 +0000 UTC m=+950.026818566" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.111178 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" podStartSLOduration=3.6185423070000002 podStartE2EDuration="23.111156956s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.53635704 +0000 UTC m=+929.484655084" lastFinishedPulling="2025-09-29 18:59:09.028971689 +0000 UTC m=+948.977269733" observedRunningTime="2025-09-29 18:59:10.108579002 +0000 UTC m=+950.056877046" watchObservedRunningTime="2025-09-29 18:59:10.111156956 +0000 UTC m=+950.059455000" Sep 29 18:59:10 crc kubenswrapper[4780]: I0929 18:59:10.137531 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" podStartSLOduration=3.655058452 podStartE2EDuration="23.13750705s" podCreationTimestamp="2025-09-29 18:58:47 +0000 UTC" firstStartedPulling="2025-09-29 18:58:49.545840172 +0000 UTC m=+929.494138226" lastFinishedPulling="2025-09-29 18:59:09.02828878 +0000 UTC m=+948.976586824" observedRunningTime="2025-09-29 18:59:10.13328775 +0000 UTC m=+950.081585794" watchObservedRunningTime="2025-09-29 18:59:10.13750705 +0000 UTC m=+950.085805094" Sep 29 18:59:17 crc kubenswrapper[4780]: I0929 18:59:17.550081 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-46wcs" Sep 29 18:59:17 crc kubenswrapper[4780]: I0929 18:59:17.856560 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-87km7" Sep 29 18:59:18 crc kubenswrapper[4780]: I0929 18:59:18.040896 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-nznlf" Sep 29 18:59:18 crc kubenswrapper[4780]: I0929 18:59:18.269801 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75756dd4d9-ksn7s" Sep 29 18:59:18 crc kubenswrapper[4780]: I0929 18:59:18.425949 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh" Sep 29 18:59:18 crc kubenswrapper[4780]: I0929 18:59:18.943014 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c7d9477-jzhjc" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.003213 4780 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.004995 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.009703 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.010169 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.011485 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rp28x" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.016719 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.040136 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.059778 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hmp\" (UniqueName: \"kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.059859 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.124157 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.125779 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.129829 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.145794 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.164819 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.164889 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hmp\" (UniqueName: \"kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.164928 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.164972 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkh8w\" (UniqueName: \"kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.164997 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.166156 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.195983 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hmp\" (UniqueName: \"kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp\") pod \"dnsmasq-dns-b8b69cf79-zwjml\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.265610 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkh8w\" (UniqueName: \"kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc 
kubenswrapper[4780]: I0929 18:59:35.265694 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.265746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.266960 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.267007 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.325196 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.329929 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkh8w\" (UniqueName: \"kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w\") pod \"dnsmasq-dns-d5f6f49c7-8lkhp\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.441971 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.679169 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.683996 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 18:59:35 crc kubenswrapper[4780]: I0929 18:59:35.761034 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:35 crc kubenswrapper[4780]: W0929 18:59:35.767875 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb7fec7e_320e_44c6_b1d4_b7f73a64c356.slice/crio-cca38b0323b4369583c08a2274697e14b836a15d58088144dbdd4e30f7c41c05 WatchSource:0}: Error finding container cca38b0323b4369583c08a2274697e14b836a15d58088144dbdd4e30f7c41c05: Status 404 returned error can't find the container with id cca38b0323b4369583c08a2274697e14b836a15d58088144dbdd4e30f7c41c05 Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.312093 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" event={"ID":"bb7fec7e-320e-44c6-b1d4-b7f73a64c356","Type":"ContainerStarted","Data":"cca38b0323b4369583c08a2274697e14b836a15d58088144dbdd4e30f7c41c05"} Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.313537 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" event={"ID":"9cec32a4-6b30-478c-b4cf-03172838668f","Type":"ContainerStarted","Data":"dce3093f97a65fd5f0c4697a321644983525f2a321f4fd2020667336bcab5879"} Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.894123 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.928179 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.929905 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:36 crc kubenswrapper[4780]: I0929 18:59:36.937953 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.011520 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxwx\" (UniqueName: \"kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.011591 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.011646 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.118297 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.118423 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxwx\" (UniqueName: \"kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.118477 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.120169 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.120608 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.149297 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxwx\" (UniqueName: 
\"kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx\") pod \"dnsmasq-dns-b6f94bdfc-g4chk\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.269922 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.399127 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.432771 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.434298 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.468670 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.629417 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.629502 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98mzv\" (UniqueName: \"kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.629542 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.737488 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.737596 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98mzv\" (UniqueName: \"kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.737644 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.739239 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.739900 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:37 crc kubenswrapper[4780]: I0929 18:59:37.774789 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98mzv\" (UniqueName: \"kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv\") pod \"dnsmasq-dns-77795d58f5-dhj4n\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.000786 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 18:59:38 crc kubenswrapper[4780]: W0929 18:59:38.012074 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57410855_4dd7_4552_8826_d127039e27a4.slice/crio-1932913b142923670f13248e57e239b74238476c37d99b3e0faf02ac5cdfcf18 WatchSource:0}: Error finding container 1932913b142923670f13248e57e239b74238476c37d99b3e0faf02ac5cdfcf18: Status 404 returned error can't find the container with id 1932913b142923670f13248e57e239b74238476c37d99b3e0faf02ac5cdfcf18 Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.057709 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.103654 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.106671 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114026 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114148 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114294 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114323 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114726 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114896 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-klx89" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.114996 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.120913 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.245007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.245464 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.245500 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.245931 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.245973 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246006 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246035 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246121 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246171 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrc8\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246206 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.246300 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.346637 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347654 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347717 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrc8\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347742 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347773 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347872 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347905 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347929 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347943 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347962 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.347984 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.348007 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.349536 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.349897 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 
18:59:38.350215 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.351369 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.354304 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.355018 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.357786 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.359853 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" event={"ID":"57410855-4dd7-4552-8826-d127039e27a4","Type":"ContainerStarted","Data":"1932913b142923670f13248e57e239b74238476c37d99b3e0faf02ac5cdfcf18"} Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.360333 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.360585 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.363224 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.379735 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrc8\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc 
kubenswrapper[4780]: I0929 18:59:38.411908 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.437834 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.579816 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.583149 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.598760 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.598833 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.599217 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.599372 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-n6ppc" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.599461 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.599574 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.607670 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.613770 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.754841 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.754895 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.754925 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.754940 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.754970 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755037 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755071 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mkcg\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755099 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755117 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755144 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.755765 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858084 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858561 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858613 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858640 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858681 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858723 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858750 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mkcg\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858776 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858806 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858844 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.858927 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.859640 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.860090 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.860301 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.860447 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.860715 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.862201 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.865185 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.865390 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.866555 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.869299 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.882283 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mkcg\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.884479 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:38 crc kubenswrapper[4780]: I0929 18:59:38.915999 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 18:59:39 crc kubenswrapper[4780]: W0929 18:59:39.005260 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ee2741_9417_4698_b550_7c596d00d271.slice/crio-8c1f691cc3507568b334ffc02e35c0ed5e3ed1451e483fdb1eb306977a343ce1 WatchSource:0}: Error finding container 8c1f691cc3507568b334ffc02e35c0ed5e3ed1451e483fdb1eb306977a343ce1: Status 404 returned error can't find the container with id 8c1f691cc3507568b334ffc02e35c0ed5e3ed1451e483fdb1eb306977a343ce1 Sep 29 18:59:39 crc kubenswrapper[4780]: I0929 18:59:39.006381 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 18:59:39 crc kubenswrapper[4780]: I0929 18:59:39.385701 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerStarted","Data":"8c1f691cc3507568b334ffc02e35c0ed5e3ed1451e483fdb1eb306977a343ce1"} Sep 29 18:59:39 crc kubenswrapper[4780]: I0929 18:59:39.389321 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" event={"ID":"879d0631-a279-4337-9540-b76028b54fbc","Type":"ContainerStarted","Data":"ff63ac8e55c1c62d692378bb70358dad3652603b7fb99885d01f4e1b48380dc2"} Sep 29 18:59:39 crc kubenswrapper[4780]: I0929 18:59:39.484290 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 18:59:39 crc kubenswrapper[4780]: W0929 18:59:39.501755 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90472c3_a09d_433c_922b_d164a11636e6.slice/crio-4eba6f78ef2f0e7a082b6b36ad455efbe29837115d31d76feb7d9fa569257107 WatchSource:0}: Error finding container 4eba6f78ef2f0e7a082b6b36ad455efbe29837115d31d76feb7d9fa569257107: Status 404 returned error can't find the container with id 4eba6f78ef2f0e7a082b6b36ad455efbe29837115d31d76feb7d9fa569257107 Sep 29 18:59:40 crc kubenswrapper[4780]: I0929 18:59:40.414869 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerStarted","Data":"4eba6f78ef2f0e7a082b6b36ad455efbe29837115d31d76feb7d9fa569257107"} Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.609518 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.611132 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.616172 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tl88q" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.616287 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.616364 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.616485 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.616530 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.629216 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.633678 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.693347 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.695885 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.701910 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jhnc7" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.702421 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.704337 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.704690 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.711534 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720634 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qpg4\" (UniqueName: \"kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720699 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720734 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720750 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720782 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720807 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720827 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets\") 
pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720848 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.720870 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822699 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822792 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqxn\" (UniqueName: \"kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822838 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822874 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822896 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qpg4\" (UniqueName: \"kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.822968 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.823034 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824110 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824186 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824232 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824302 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824341 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824389 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824428 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824462 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824501 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " 
pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824549 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.824588 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.825141 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.826157 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.829898 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.834515 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.836563 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.854994 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.861092 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qpg4\" (UniqueName: \"kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.861639 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.862065 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.883790 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets\") pod \"openstack-galera-0\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928616 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928687 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928715 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928790 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqxn\" (UniqueName: \"kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928811 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928877 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928903 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 
crc kubenswrapper[4780]: I0929 18:59:41.928926 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.928966 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.932000 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.932287 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.932339 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.933379 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.935497 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.938157 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.939194 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.944898 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.954558 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.959630 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.960577 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.962031 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.962356 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gw9gf" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.962543 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.983088 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqxn\" (UniqueName: \"kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:41 crc kubenswrapper[4780]: I0929 18:59:41.992137 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.008244 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.054863 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.133424 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.133527 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.133608 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.134018 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.134178 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8gs\" (UniqueName: \"kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.239519 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.239610 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.239658 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.239697 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.239750 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8gs\" 
(UniqueName: \"kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.241062 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.241675 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.244193 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.248712 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.261241 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8gs\" (UniqueName: \"kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs\") pod \"memcached-0\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " pod="openstack/memcached-0" Sep 29 18:59:42 crc kubenswrapper[4780]: I0929 18:59:42.322731 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 18:59:43 crc kubenswrapper[4780]: I0929 18:59:43.948755 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 18:59:43 crc kubenswrapper[4780]: I0929 18:59:43.951895 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 18:59:43 crc kubenswrapper[4780]: I0929 18:59:43.954181 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n6mbr" Sep 29 18:59:43 crc kubenswrapper[4780]: I0929 18:59:43.962033 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 18:59:44 crc kubenswrapper[4780]: I0929 18:59:44.105209 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k5gl\" (UniqueName: \"kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl\") pod \"kube-state-metrics-0\" (UID: \"ad923606-d2ba-467f-b983-c8e77c9f6cc3\") " pod="openstack/kube-state-metrics-0" Sep 29 18:59:44 crc kubenswrapper[4780]: I0929 18:59:44.207227 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k5gl\" (UniqueName: \"kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl\") pod \"kube-state-metrics-0\" (UID: \"ad923606-d2ba-467f-b983-c8e77c9f6cc3\") " pod="openstack/kube-state-metrics-0" Sep 29 18:59:44 crc kubenswrapper[4780]: I0929 18:59:44.247819 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k5gl\" (UniqueName: \"kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl\") pod \"kube-state-metrics-0\" (UID: \"ad923606-d2ba-467f-b983-c8e77c9f6cc3\") " pod="openstack/kube-state-metrics-0" Sep 29 18:59:44 crc kubenswrapper[4780]: I0929 18:59:44.319473 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.580165 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.582621 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.586537 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.586772 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8frbf" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.586831 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.586848 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.586862 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.596022 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.692891 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693307 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693376 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76bh8\" (UniqueName: \"kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693407 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693448 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693469 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693825 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.693859 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.795828 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.795889 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.795919 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76bh8\" (UniqueName: \"kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.796476 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.796516 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.796557 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.796693 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.796738 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 
crc kubenswrapper[4780]: I0929 18:59:47.796760 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.798161 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.799036 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.800029 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.804356 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.805881 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.814790 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.819020 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.819321 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76bh8\" (UniqueName: \"kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8\") pod \"ovsdbserver-nb-0\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:47 crc kubenswrapper[4780]: I0929 18:59:47.912104 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.277238 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.278464 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.287617 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.287719 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rh4xb" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.291522 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.296399 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.394787 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.397663 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.408774 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.408908 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.408970 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.409005 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.409031 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc8z8\" (UniqueName: \"kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.409072 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.409100 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.430939 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511438 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511513 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78swl\" (UniqueName: \"kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511552 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511616 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511651 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511713 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511775 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511829 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511861 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511883 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc8z8\" (UniqueName: \"kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511907 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511948 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.511992 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.512381 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.512513 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.512982 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.515020 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") 
" pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.520861 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.533994 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8z8\" (UniqueName: \"kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.533388 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs\") pod \"ovn-controller-hzb5x\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.606720 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.613831 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614003 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614439 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614565 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78swl\" (UniqueName: \"kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614599 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614642 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " 
pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614718 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614724 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.614853 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.615011 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.617477 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.639560 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78swl\" (UniqueName: \"kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl\") pod \"ovn-controller-ovs-tqkx6\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:48 crc kubenswrapper[4780]: I0929 18:59:48.731411 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.290925 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.293268 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.302312 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-zbjq9" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.303171 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.303613 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.307164 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.307520 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466354 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466420 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466448 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466534 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466577 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466692 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxqq\" (UniqueName: \"kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466740 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.466767 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570638 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570723 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570755 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570778 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570810 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570889 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdxqq\" (UniqueName: \"kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570923 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.570955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.571568 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.571890 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.572171 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.572788 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.576702 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.588918 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.590663 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.603443 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.610771 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxqq\" (UniqueName: \"kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq\") pod \"ovsdbserver-sb-0\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:51 crc kubenswrapper[4780]: I0929 18:59:51.623517 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.802844 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.804197 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkh8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d5f6f49c7-8lkhp_openstack(9cec32a4-6b30-478c-b4cf-03172838668f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.805646 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" podUID="9cec32a4-6b30-478c-b4cf-03172838668f" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.886813 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 29 18:59:54 
crc kubenswrapper[4780]: E0929 18:59:54.887272 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98mzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77795d58f5-dhj4n_openstack(879d0631-a279-4337-9540-b76028b54fbc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.888835 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" podUID="879d0631-a279-4337-9540-b76028b54fbc" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.908564 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.908809 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5hmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b8b69cf79-zwjml_openstack(bb7fec7e-320e-44c6-b1d4-b7f73a64c356): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:54 crc kubenswrapper[4780]: E0929 18:59:54.910059 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" podUID="bb7fec7e-320e-44c6-b1d4-b7f73a64c356" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.601323 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b\\\"\"" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" podUID="879d0631-a279-4337-9540-b76028b54fbc" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.680899 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.681200 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mkcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(b90472c3-a09d-433c-922b-d164a11636e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.682616 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b90472c3-a09d-433c-922b-d164a11636e6" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.709174 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.709413 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrxwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b6f94bdfc-g4chk_openstack(57410855-4dd7-4552-8826-d127039e27a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.709494 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.709740 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwrc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(d2ee2741-9417-4698-b550-7c596d00d271): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.719288 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="d2ee2741-9417-4698-b550-7c596d00d271" Sep 29 18:59:55 crc kubenswrapper[4780]: E0929 18:59:55.719350 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" podUID="57410855-4dd7-4552-8826-d127039e27a4" Sep 29 18:59:55 crc kubenswrapper[4780]: I0929 18:59:55.906847 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:55 crc kubenswrapper[4780]: I0929 18:59:55.960387 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hmp\" (UniqueName: \"kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp\") pod \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " Sep 29 18:59:55 crc kubenswrapper[4780]: I0929 18:59:55.960471 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config\") pod \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\" (UID: \"bb7fec7e-320e-44c6-b1d4-b7f73a64c356\") " Sep 29 18:59:55 crc kubenswrapper[4780]: I0929 18:59:55.962307 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config" (OuterVolumeSpecName: "config") pod "bb7fec7e-320e-44c6-b1d4-b7f73a64c356" (UID: "bb7fec7e-320e-44c6-b1d4-b7f73a64c356"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:59:55 crc kubenswrapper[4780]: I0929 18:59:55.972629 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp" (OuterVolumeSpecName: "kube-api-access-c5hmp") pod "bb7fec7e-320e-44c6-b1d4-b7f73a64c356" (UID: "bb7fec7e-320e-44c6-b1d4-b7f73a64c356"). InnerVolumeSpecName "kube-api-access-c5hmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.063630 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5hmp\" (UniqueName: \"kubernetes.io/projected/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-kube-api-access-c5hmp\") on node \"crc\" DevicePath \"\"" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.063697 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7fec7e-320e-44c6-b1d4-b7f73a64c356-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.085978 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.165144 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkh8w\" (UniqueName: \"kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w\") pod \"9cec32a4-6b30-478c-b4cf-03172838668f\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.165265 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc\") pod \"9cec32a4-6b30-478c-b4cf-03172838668f\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.165335 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config\") pod \"9cec32a4-6b30-478c-b4cf-03172838668f\" (UID: \"9cec32a4-6b30-478c-b4cf-03172838668f\") " Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.165824 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config" (OuterVolumeSpecName: "config") pod "9cec32a4-6b30-478c-b4cf-03172838668f" (UID: "9cec32a4-6b30-478c-b4cf-03172838668f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.165823 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cec32a4-6b30-478c-b4cf-03172838668f" (UID: "9cec32a4-6b30-478c-b4cf-03172838668f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.170654 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w" (OuterVolumeSpecName: "kube-api-access-nkh8w") pod "9cec32a4-6b30-478c-b4cf-03172838668f" (UID: "9cec32a4-6b30-478c-b4cf-03172838668f"). InnerVolumeSpecName "kube-api-access-nkh8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.274010 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.274237 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cec32a4-6b30-478c-b4cf-03172838668f-config\") on node \"crc\" DevicePath \"\"" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.274265 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkh8w\" (UniqueName: \"kubernetes.io/projected/9cec32a4-6b30-478c-b4cf-03172838668f-kube-api-access-nkh8w\") on node \"crc\" DevicePath \"\"" Sep 29 18:59:56 crc kubenswrapper[4780]: W0929 18:59:56.371426 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad923606_d2ba_467f_b983_c8e77c9f6cc3.slice/crio-d2a560e1a3f6c105554d98ae9125dbd34074025eee08355f7f8826c66eb4b4a0 WatchSource:0}: Error finding container d2a560e1a3f6c105554d98ae9125dbd34074025eee08355f7f8826c66eb4b4a0: Status 404 returned error can't find the container with id d2a560e1a3f6c105554d98ae9125dbd34074025eee08355f7f8826c66eb4b4a0 Sep 29 18:59:56 crc kubenswrapper[4780]: W0929 18:59:56.371938 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48191511_38e9_46d2_82f8_77453769927c.slice/crio-3d2e10a927a85365dd4b2fa87dcee5ec79b99f946e90d4eb60c6b829653af6ee WatchSource:0}: Error finding container 3d2e10a927a85365dd4b2fa87dcee5ec79b99f946e90d4eb60c6b829653af6ee: Status 404 returned error can't find the container with id 3d2e10a927a85365dd4b2fa87dcee5ec79b99f946e90d4eb60c6b829653af6ee Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.372760 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.386217 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.490851 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 18:59:56 crc kubenswrapper[4780]: W0929 18:59:56.497717 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ef0b7e_a06d_49a2_824e_9f088c267a97.slice/crio-1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337 WatchSource:0}: Error finding container 1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337: Status 404 returned error can't find the container with id 1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337 Sep 29 18:59:56 crc kubenswrapper[4780]: W0929 18:59:56.527122 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8611dff0_9ad1_4bba_b687_958d7e887859.slice/crio-9fff2f32a4c1ab9ea09b9e77c1a8faaab8450d76986c65c67904cfe2197ba50e WatchSource:0}: Error finding container 9fff2f32a4c1ab9ea09b9e77c1a8faaab8450d76986c65c67904cfe2197ba50e: Status 404 returned error can't find the container with id 9fff2f32a4c1ab9ea09b9e77c1a8faaab8450d76986c65c67904cfe2197ba50e Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.527504 4780 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.575248 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.606016 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 18:59:56 crc kubenswrapper[4780]: W0929 18:59:56.611273 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod628b549e_6d99_43d4_94bb_61b457f4c37b.slice/crio-aa9d73c831e1295dcb9e12df8b4e963b68ccc3a2e1d42a671190107f6e82a599 WatchSource:0}: Error finding container aa9d73c831e1295dcb9e12df8b4e963b68ccc3a2e1d42a671190107f6e82a599: Status 404 returned error can't find the container with id aa9d73c831e1295dcb9e12df8b4e963b68ccc3a2e1d42a671190107f6e82a599 Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.612779 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" event={"ID":"9cec32a4-6b30-478c-b4cf-03172838668f","Type":"ContainerDied","Data":"dce3093f97a65fd5f0c4697a321644983525f2a321f4fd2020667336bcab5879"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.612817 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5f6f49c7-8lkhp" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.620357 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58ef0b7e-a06d-49a2-824e-9f088c267a97","Type":"ContainerStarted","Data":"1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.627843 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerStarted","Data":"9fff2f32a4c1ab9ea09b9e77c1a8faaab8450d76986c65c67904cfe2197ba50e"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.629567 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerStarted","Data":"3d2e10a927a85365dd4b2fa87dcee5ec79b99f946e90d4eb60c6b829653af6ee"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.630623 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" event={"ID":"bb7fec7e-320e-44c6-b1d4-b7f73a64c356","Type":"ContainerDied","Data":"cca38b0323b4369583c08a2274697e14b836a15d58088144dbdd4e30f7c41c05"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.630664 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8b69cf79-zwjml" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.631730 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad923606-d2ba-467f-b983-c8e77c9f6cc3","Type":"ContainerStarted","Data":"d2a560e1a3f6c105554d98ae9125dbd34074025eee08355f7f8826c66eb4b4a0"} Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.633109 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x" event={"ID":"91a8fa86-9475-490a-9c9f-09233413eab5","Type":"ContainerStarted","Data":"be23bffc337a1f4b1dc6d57e5dda65612b10165d95d04f10642bbea383becb7f"} Sep 29 18:59:56 crc kubenswrapper[4780]: E0929 18:59:56.635159 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b90472c3-a09d-433c-922b-d164a11636e6" Sep 29 18:59:56 crc kubenswrapper[4780]: E0929 18:59:56.635615 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:37cc031749b113c35231066ce9f8ce7ccc83e21808ba92ea1981e72bbc42e80f\\\"\"" pod="openstack/rabbitmq-server-0" podUID="d2ee2741-9417-4698-b550-7c596d00d271" Sep 29 18:59:56 crc kubenswrapper[4780]: E0929 18:59:56.637651 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b\\\"\"" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" podUID="57410855-4dd7-4552-8826-d127039e27a4" Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.652668 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.769544 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.769583 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8b69cf79-zwjml"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.799713 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:56 crc kubenswrapper[4780]: I0929 18:59:56.800769 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5f6f49c7-8lkhp"] Sep 29 18:59:57 crc kubenswrapper[4780]: I0929 18:59:57.644675 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerStarted","Data":"aa9d73c831e1295dcb9e12df8b4e963b68ccc3a2e1d42a671190107f6e82a599"} Sep 29 18:59:57 crc kubenswrapper[4780]: I0929 18:59:57.647470 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerStarted","Data":"1344e1d34b72f29344aad148d07f6a2a075170c1e62311fa75d1c29069ca2804"} Sep 29 18:59:57 crc kubenswrapper[4780]: I0929 18:59:57.741003 4780 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 18:59:57 crc kubenswrapper[4780]: W0929 18:59:57.744174 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b9c388_0f74_42fc_bf3d_711322b976d8.slice/crio-849a94a4e4201ed620437ffed48bceb3e1f3ed245c6c02acd8932d69d398ccba WatchSource:0}: Error finding container 849a94a4e4201ed620437ffed48bceb3e1f3ed245c6c02acd8932d69d398ccba: Status 404 returned error can't find the container with id 849a94a4e4201ed620437ffed48bceb3e1f3ed245c6c02acd8932d69d398ccba Sep 29 18:59:58 crc kubenswrapper[4780]: I0929 18:59:58.656135 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerStarted","Data":"849a94a4e4201ed620437ffed48bceb3e1f3ed245c6c02acd8932d69d398ccba"} Sep 29 18:59:58 crc kubenswrapper[4780]: I0929 18:59:58.769458 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cec32a4-6b30-478c-b4cf-03172838668f" path="/var/lib/kubelet/pods/9cec32a4-6b30-478c-b4cf-03172838668f/volumes" Sep 29 18:59:58 crc kubenswrapper[4780]: I0929 18:59:58.769850 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7fec7e-320e-44c6-b1d4-b7f73a64c356" path="/var/lib/kubelet/pods/bb7fec7e-320e-44c6-b1d4-b7f73a64c356/volumes" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.140978 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw"] Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.144621 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.147157 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.150334 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.165276 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw"] Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.166586 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l24ls\" (UniqueName: \"kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.166650 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.167084 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.268611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l24ls\" (UniqueName: \"kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.268662 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.268736 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.269882 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.285775 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l24ls\" (UniqueName: \"kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.290739 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume\") pod \"collect-profiles-29319540-kj8xw\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:00 crc kubenswrapper[4780]: I0929 19:00:00.476309 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:05 crc kubenswrapper[4780]: I0929 19:00:05.735861 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw"] Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.730299 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerStarted","Data":"463b5f829fe19e14d7099fe53489e9ad00fece5bc4c41ff73db16873402d77dc"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.733846 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerStarted","Data":"8af73da64b8697605018768f5efc2298ee5aa5426fed89f61dfd4c0b10c58708"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.736144 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x" event={"ID":"91a8fa86-9475-490a-9c9f-09233413eab5","Type":"ContainerStarted","Data":"5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.736311 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hzb5x" Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.738554 4780 generic.go:334] "Generic (PLEG): container finished" podID="4fc73004-9d27-4d7e-8082-95d33f7b0fc7" containerID="edb2d720a111d6be8d4f3a2b4f81db83411c60a97b324de83c9883841ab42430" exitCode=0 Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.738696 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" event={"ID":"4fc73004-9d27-4d7e-8082-95d33f7b0fc7","Type":"ContainerDied","Data":"edb2d720a111d6be8d4f3a2b4f81db83411c60a97b324de83c9883841ab42430"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.738727 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" event={"ID":"4fc73004-9d27-4d7e-8082-95d33f7b0fc7","Type":"ContainerStarted","Data":"fdc9c83f57d99637abf2e138ac1b744f00b708b7ce07457821167f253d5165ae"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.740612 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58ef0b7e-a06d-49a2-824e-9f088c267a97","Type":"ContainerStarted","Data":"fa44c2b6e56600dfb6c99d6fb0e419237762ff70fabe663a6e3f18eded510c50"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.740738 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.742610 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad923606-d2ba-467f-b983-c8e77c9f6cc3","Type":"ContainerStarted","Data":"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.742779 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.745113 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" 
event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerStarted","Data":"56cd11a363afa5285113dcd494182baca0f5cd0564a4c59d2c667f8b958be968"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.747110 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerStarted","Data":"554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.748683 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerStarted","Data":"bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce"} Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.782592 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.180490569 podStartE2EDuration="23.782571215s" podCreationTimestamp="2025-09-29 18:59:43 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.373572601 +0000 UTC m=+996.321870645" lastFinishedPulling="2025-09-29 19:00:05.975653247 +0000 UTC m=+1005.923951291" observedRunningTime="2025-09-29 19:00:06.780992918 +0000 UTC m=+1006.729290962" watchObservedRunningTime="2025-09-29 19:00:06.782571215 +0000 UTC m=+1006.730869249" Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.900571 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.275721762 podStartE2EDuration="25.900548803s" podCreationTimestamp="2025-09-29 18:59:41 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.502612053 +0000 UTC m=+996.450910097" lastFinishedPulling="2025-09-29 19:00:05.127439094 +0000 UTC m=+1005.075737138" observedRunningTime="2025-09-29 19:00:06.899553083 +0000 UTC m=+1006.847851127" watchObservedRunningTime="2025-09-29 19:00:06.900548803 +0000 UTC m=+1006.848846847" Sep 29 19:00:06 crc kubenswrapper[4780]: I0929 19:00:06.952757 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hzb5x" podStartSLOduration=9.681952977 podStartE2EDuration="18.952732586s" podCreationTimestamp="2025-09-29 18:59:48 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.579989247 +0000 UTC m=+996.528287291" lastFinishedPulling="2025-09-29 19:00:05.850768856 +0000 UTC m=+1005.799066900" observedRunningTime="2025-09-29 19:00:06.943578947 +0000 UTC m=+1006.891876991" watchObservedRunningTime="2025-09-29 19:00:06.952732586 +0000 UTC m=+1006.901030630" Sep 29 19:00:07 crc kubenswrapper[4780]: I0929 19:00:07.759447 4780 generic.go:334] "Generic (PLEG): container finished" podID="879d0631-a279-4337-9540-b76028b54fbc" containerID="17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1" exitCode=0 Sep 29 19:00:07 crc kubenswrapper[4780]: I0929 19:00:07.759517 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" event={"ID":"879d0631-a279-4337-9540-b76028b54fbc","Type":"ContainerDied","Data":"17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1"} Sep 29 19:00:07 crc kubenswrapper[4780]: I0929 19:00:07.766892 4780 generic.go:334] "Generic (PLEG): container finished" podID="3c91af49-2adc-47a1-892c-82da3b338492" containerID="56cd11a363afa5285113dcd494182baca0f5cd0564a4c59d2c667f8b958be968" exitCode=0 Sep 29 19:00:07 crc kubenswrapper[4780]: I0929 19:00:07.767100 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerDied","Data":"56cd11a363afa5285113dcd494182baca0f5cd0564a4c59d2c667f8b958be968"} Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.091770 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.155898 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l24ls\" (UniqueName: \"kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls\") pod \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.156086 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume\") pod \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.156274 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume\") pod \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\" (UID: \"4fc73004-9d27-4d7e-8082-95d33f7b0fc7\") " Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.156988 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume" (OuterVolumeSpecName: "config-volume") pod "4fc73004-9d27-4d7e-8082-95d33f7b0fc7" (UID: "4fc73004-9d27-4d7e-8082-95d33f7b0fc7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.161080 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4fc73004-9d27-4d7e-8082-95d33f7b0fc7" (UID: "4fc73004-9d27-4d7e-8082-95d33f7b0fc7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.161842 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls" (OuterVolumeSpecName: "kube-api-access-l24ls") pod "4fc73004-9d27-4d7e-8082-95d33f7b0fc7" (UID: "4fc73004-9d27-4d7e-8082-95d33f7b0fc7"). InnerVolumeSpecName "kube-api-access-l24ls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.258986 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.259036 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l24ls\" (UniqueName: \"kubernetes.io/projected/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-kube-api-access-l24ls\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.259071 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fc73004-9d27-4d7e-8082-95d33f7b0fc7-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.779764 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" event={"ID":"879d0631-a279-4337-9540-b76028b54fbc","Type":"ContainerStarted","Data":"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8"} Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.780444 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.794017 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerStarted","Data":"d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de"} Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.794107 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerStarted","Data":"0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea"} Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.794414 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.794528 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.803171 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" event={"ID":"4fc73004-9d27-4d7e-8082-95d33f7b0fc7","Type":"ContainerDied","Data":"fdc9c83f57d99637abf2e138ac1b744f00b708b7ce07457821167f253d5165ae"} Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.803223 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.803226 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdc9c83f57d99637abf2e138ac1b744f00b708b7ce07457821167f253d5165ae" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.810732 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" podStartSLOduration=2.922564031 podStartE2EDuration="31.810716378s" podCreationTimestamp="2025-09-29 18:59:37 +0000 UTC" firstStartedPulling="2025-09-29 18:59:38.356366925 +0000 UTC m=+978.304664969" lastFinishedPulling="2025-09-29 19:00:07.244519272 +0000 UTC m=+1007.192817316" observedRunningTime="2025-09-29 19:00:08.808528024 +0000 UTC m=+1008.756826078" watchObservedRunningTime="2025-09-29 19:00:08.810716378 +0000 UTC m=+1008.759014422" Sep 29 19:00:08 crc kubenswrapper[4780]: I0929 19:00:08.835431 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tqkx6" podStartSLOduration=12.237324457 podStartE2EDuration="20.835414624s" podCreationTimestamp="2025-09-29 18:59:48 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.662882763 +0000 UTC m=+996.611180807" lastFinishedPulling="2025-09-29 19:00:05.26097294 +0000 UTC m=+1005.209270974" observedRunningTime="2025-09-29 19:00:08.8308302 +0000 UTC m=+1008.779128244" watchObservedRunningTime="2025-09-29 19:00:08.835414624 +0000 UTC m=+1008.783712668" Sep 29 19:00:09 crc kubenswrapper[4780]: I0929 19:00:09.813377 4780 generic.go:334] "Generic (PLEG): container finished" podID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerID="8af73da64b8697605018768f5efc2298ee5aa5426fed89f61dfd4c0b10c58708" exitCode=0 Sep 29 19:00:09 crc kubenswrapper[4780]: I0929 19:00:09.813689 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerDied","Data":"8af73da64b8697605018768f5efc2298ee5aa5426fed89f61dfd4c0b10c58708"} Sep 29 19:00:09 crc kubenswrapper[4780]: I0929 19:00:09.817439 4780 generic.go:334] "Generic (PLEG): container finished" podID="48191511-38e9-46d2-82f8-77453769927c" containerID="463b5f829fe19e14d7099fe53489e9ad00fece5bc4c41ff73db16873402d77dc" exitCode=0 Sep 29 19:00:09 crc kubenswrapper[4780]: I0929 19:00:09.818002 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerDied","Data":"463b5f829fe19e14d7099fe53489e9ad00fece5bc4c41ff73db16873402d77dc"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.829118 4780 generic.go:334] "Generic (PLEG): container finished" podID="57410855-4dd7-4552-8826-d127039e27a4" containerID="06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79" exitCode=0 Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.829227 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" event={"ID":"57410855-4dd7-4552-8826-d127039e27a4","Type":"ContainerDied","Data":"06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.833125 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerStarted","Data":"1a72c9638b5649fd8982600fc6af41f0dfb4434ab14f6a9fa20981be04918d1c"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.840374 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerStarted","Data":"ca3350f3db78178478e71ece3c7e24a200961466f9224460cb27c414e7b48f42"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.842693 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerStarted","Data":"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.844442 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerStarted","Data":"f87b8bafb323301052d22ea81d2721d5221500537424fea022247a8e792a03e3"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.845706 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerStarted","Data":"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.847438 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerStarted","Data":"ff8a529133b59522aa5a47a19801e5fe0c76dbf90cf9186ffe730d3e74db9aba"} Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.906532 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.426218033 podStartE2EDuration="20.906509671s" podCreationTimestamp="2025-09-29 18:59:50 +0000 UTC" firstStartedPulling="2025-09-29 18:59:57.746881465 +0000 UTC m=+997.695179509" lastFinishedPulling="2025-09-29 19:00:10.227173103 +0000 UTC m=+1010.175471147" observedRunningTime="2025-09-29 19:00:10.905064108 +0000 UTC m=+1010.853362152" watchObservedRunningTime="2025-09-29 19:00:10.906509671 +0000 UTC m=+1010.854807715" Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.968387 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.49954817 podStartE2EDuration="30.968366539s" podCreationTimestamp="2025-09-29 18:59:40 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.376743474 +0000 UTC m=+996.325041528" lastFinishedPulling="2025-09-29 19:00:05.845561853 +0000 UTC m=+1005.793859897" observedRunningTime="2025-09-29 19:00:10.960473497 +0000 UTC m=+1010.908771541" watchObservedRunningTime="2025-09-29 19:00:10.968366539 +0000 UTC m=+1010.916664583" Sep 29 19:00:10 crc kubenswrapper[4780]: I0929 19:00:10.985397 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.357700907 podStartE2EDuration="24.985378299s" podCreationTimestamp="2025-09-29 18:59:46 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.533903533 +0000 UTC m=+996.482201577" lastFinishedPulling="2025-09-29 19:00:10.161580925 +0000 UTC m=+1010.109878969" observedRunningTime="2025-09-29 19:00:10.982219616 +0000 UTC m=+1010.930517680" watchObservedRunningTime="2025-09-29 19:00:10.985378299 +0000 UTC m=+1010.933676343" Sep 29 19:00:11 crc 
kubenswrapper[4780]: I0929 19:00:11.009522 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.342418405 podStartE2EDuration="31.009495398s" podCreationTimestamp="2025-09-29 18:59:40 +0000 UTC" firstStartedPulling="2025-09-29 18:59:56.618249512 +0000 UTC m=+996.566547556" lastFinishedPulling="2025-09-29 19:00:05.285326505 +0000 UTC m=+1005.233624549" observedRunningTime="2025-09-29 19:00:11.004337806 +0000 UTC m=+1010.952635850" watchObservedRunningTime="2025-09-29 19:00:11.009495398 +0000 UTC m=+1010.957793432" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.498995 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:00:11 crc kubenswrapper[4780]: E0929 19:00:11.499718 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc73004-9d27-4d7e-8082-95d33f7b0fc7" containerName="collect-profiles" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.499738 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc73004-9d27-4d7e-8082-95d33f7b0fc7" containerName="collect-profiles" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.499931 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc73004-9d27-4d7e-8082-95d33f7b0fc7" containerName="collect-profiles" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.500581 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.508029 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.530665 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.624672 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.640900 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhw6r\" (UniqueName: \"kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.641003 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.641025 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.641111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.641140 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.641291 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.657166 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.699280 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.701361 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.705022 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.721477 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744386 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744440 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744478 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744510 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744558 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.744602 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhw6r\" (UniqueName: \"kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.745312 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.745391 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.746880 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.750952 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.763888 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.767873 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhw6r\" (UniqueName: \"kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r\") pod \"ovn-controller-metrics-8vsrs\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.826659 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.853298 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2lnw\" (UniqueName: \"kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.853421 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.853572 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.853712 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.874410 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" event={"ID":"57410855-4dd7-4552-8826-d127039e27a4","Type":"ContainerStarted","Data":"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06"} Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.904493 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.904845 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="dnsmasq-dns" containerID="cri-o://a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8" gracePeriod=10 Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.914034 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.919484 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.919489 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="dnsmasq-dns" containerID="cri-o://7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06" gracePeriod=10 Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.939355 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.939611 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-galera-0" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.977446 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.977623 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.977749 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.977871 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2lnw\" (UniqueName: \"kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.979869 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.980204 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.980712 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.982511 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.986002 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.990748 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.998726 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:11 crc kubenswrapper[4780]: I0929 19:00:11.998838 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" podStartSLOduration=-9223372000.855965 podStartE2EDuration="35.998811557s" podCreationTimestamp="2025-09-29 18:59:36 +0000 UTC" firstStartedPulling="2025-09-29 18:59:38.016822045 +0000 UTC m=+977.965120089" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:11.977064858 +0000 UTC m=+1011.925362902" watchObservedRunningTime="2025-09-29 19:00:11.998811557 +0000 UTC m=+1011.947109601" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.013766 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2lnw\" (UniqueName: \"kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw\") pod \"dnsmasq-dns-86b869995c-stchb\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.027691 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.039693 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.057594 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.058733 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.081508 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.081974 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.082126 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.082252 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxc84\" (UniqueName: \"kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.082296 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.184624 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxc84\" (UniqueName: \"kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.184688 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.184756 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.184824 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.184871 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.185896 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.186853 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.187401 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.188029 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.207358 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxc84\" (UniqueName: \"kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84\") pod \"dnsmasq-dns-5d86d68bf7-dpwtl\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.330745 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.433199 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.580227 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.591257 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.595291 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config\") pod \"57410855-4dd7-4552-8826-d127039e27a4\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.595349 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrxwx\" (UniqueName: \"kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx\") pod \"57410855-4dd7-4552-8826-d127039e27a4\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.595543 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc\") pod \"57410855-4dd7-4552-8826-d127039e27a4\" (UID: \"57410855-4dd7-4552-8826-d127039e27a4\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.601801 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx" (OuterVolumeSpecName: "kube-api-access-zrxwx") pod "57410855-4dd7-4552-8826-d127039e27a4" (UID: "57410855-4dd7-4552-8826-d127039e27a4"). InnerVolumeSpecName "kube-api-access-zrxwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.626915 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.699072 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98mzv\" (UniqueName: \"kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv\") pod \"879d0631-a279-4337-9540-b76028b54fbc\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.699608 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config\") pod \"879d0631-a279-4337-9540-b76028b54fbc\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.699813 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc\") pod \"879d0631-a279-4337-9540-b76028b54fbc\" (UID: \"879d0631-a279-4337-9540-b76028b54fbc\") " Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.703915 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrxwx\" (UniqueName: \"kubernetes.io/projected/57410855-4dd7-4552-8826-d127039e27a4-kube-api-access-zrxwx\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.709500 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv" (OuterVolumeSpecName: "kube-api-access-98mzv") pod "879d0631-a279-4337-9540-b76028b54fbc" (UID: "879d0631-a279-4337-9540-b76028b54fbc"). InnerVolumeSpecName "kube-api-access-98mzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.710488 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57410855-4dd7-4552-8826-d127039e27a4" (UID: "57410855-4dd7-4552-8826-d127039e27a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.729233 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.740746 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config" (OuterVolumeSpecName: "config") pod "57410855-4dd7-4552-8826-d127039e27a4" (UID: "57410855-4dd7-4552-8826-d127039e27a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.797967 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "879d0631-a279-4337-9540-b76028b54fbc" (UID: "879d0631-a279-4337-9540-b76028b54fbc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.801265 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config" (OuterVolumeSpecName: "config") pod "879d0631-a279-4337-9540-b76028b54fbc" (UID: "879d0631-a279-4337-9540-b76028b54fbc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:12 crc kubenswrapper[4780]: W0929 19:00:12.805001 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d6093ab_c0af_4e25_96db_b8a6b64ea464.slice/crio-567bf938b7dd7d271a109b4c2b0ca8a9525132334eb7d87dc25a2c717ff8ab11 WatchSource:0}: Error finding container 567bf938b7dd7d271a109b4c2b0ca8a9525132334eb7d87dc25a2c717ff8ab11: Status 404 returned error can't find the container with id 567bf938b7dd7d271a109b4c2b0ca8a9525132334eb7d87dc25a2c717ff8ab11 Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.805919 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.805955 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.805967 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57410855-4dd7-4552-8826-d127039e27a4-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.805979 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98mzv\" (UniqueName: \"kubernetes.io/projected/879d0631-a279-4337-9540-b76028b54fbc-kube-api-access-98mzv\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.805990 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/879d0631-a279-4337-9540-b76028b54fbc-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.813570 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.824758 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.895527 4780 generic.go:334] "Generic (PLEG): container finished" podID="57410855-4dd7-4552-8826-d127039e27a4" containerID="7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06" exitCode=0 Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.895603 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.895628 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" event={"ID":"57410855-4dd7-4552-8826-d127039e27a4","Type":"ContainerDied","Data":"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.897036 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b6f94bdfc-g4chk" event={"ID":"57410855-4dd7-4552-8826-d127039e27a4","Type":"ContainerDied","Data":"1932913b142923670f13248e57e239b74238476c37d99b3e0faf02ac5cdfcf18"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.897091 4780 scope.go:117] "RemoveContainer" containerID="7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.904185 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b869995c-stchb" event={"ID":"9d6093ab-c0af-4e25-96db-b8a6b64ea464","Type":"ContainerStarted","Data":"567bf938b7dd7d271a109b4c2b0ca8a9525132334eb7d87dc25a2c717ff8ab11"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.910947 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vsrs" event={"ID":"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8","Type":"ContainerStarted","Data":"aa20fc20ab16ba8887da12486113184683f526c2c90d5e16275cee20954886fa"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.912396 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.919175 4780 generic.go:334] "Generic (PLEG): container finished" podID="879d0631-a279-4337-9540-b76028b54fbc" containerID="a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8" exitCode=0 Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.919251 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" event={"ID":"879d0631-a279-4337-9540-b76028b54fbc","Type":"ContainerDied","Data":"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.919324 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" event={"ID":"879d0631-a279-4337-9540-b76028b54fbc","Type":"ContainerDied","Data":"ff63ac8e55c1c62d692378bb70358dad3652603b7fb99885d01f4e1b48380dc2"} Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.919875 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77795d58f5-dhj4n" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.940238 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.945444 4780 scope.go:117] "RemoveContainer" containerID="06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.947557 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b6f94bdfc-g4chk"] Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.983862 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.985636 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.992129 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 19:00:12 crc kubenswrapper[4780]: I0929 19:00:12.999398 4780 scope.go:117] "RemoveContainer" containerID="7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.000118 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06\": container with ID starting with 7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06 not found: ID does not exist" containerID="7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.000173 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06"} err="failed to get container status \"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06\": rpc error: code = NotFound desc = could not find container \"7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06\": container with ID starting with 7eece398b137d2033fbdee4939fa26d232c774f2b77b7422ea72a22f26433a06 not found: ID does not exist" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.000207 4780 scope.go:117] "RemoveContainer" containerID="06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.000637 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79\": container with ID starting with 06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79 not found: ID does not exist" containerID="06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.000662 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79"} err="failed to get container status \"06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79\": rpc error: code = NotFound desc = could not find container \"06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79\": container with ID starting with 06b60b990eaee7708ca4ef84fd7320120d5bd99272faf9467be346c92dddad79 not found: ID does not exist" 
Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.000681 4780 scope.go:117] "RemoveContainer" containerID="a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.001366 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77795d58f5-dhj4n"] Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.051500 4780 scope.go:117] "RemoveContainer" containerID="17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.102321 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:13 crc kubenswrapper[4780]: W0929 19:00:13.152265 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aecfdf6_2935_4836_bd6f_ffb2829ca7bf.slice/crio-6e3396ac007bfcb0d60c429c73932a0921fde7b37378884a48ff0c2cad1f7ded WatchSource:0}: Error finding container 6e3396ac007bfcb0d60c429c73932a0921fde7b37378884a48ff0c2cad1f7ded: Status 404 returned error can't find the container with id 6e3396ac007bfcb0d60c429c73932a0921fde7b37378884a48ff0c2cad1f7ded Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.311755 4780 scope.go:117] "RemoveContainer" containerID="a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.313077 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8\": container with ID starting with a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8 not found: ID does not exist" containerID="a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.313129 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8"} err="failed to get container status \"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8\": rpc error: code = NotFound desc = could not find container \"a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8\": container with ID starting with a3454330fbfa072ac9bc30e11c7019bd88dd2d794b213e26c22f162fd889f0e8 not found: ID does not exist" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.313159 4780 scope.go:117] "RemoveContainer" containerID="17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.313566 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1\": container with ID starting with 17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1 not found: ID does not exist" containerID="17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.313638 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1"} err="failed to get container status \"17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1\": rpc error: code = NotFound desc = could not find container 
\"17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1\": container with ID starting with 17c135ca8c4731aaff9ea952af7e09c866e523e0996cffeb1da6061f95a2edd1 not found: ID does not exist" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.384224 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.385009 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="init" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.389335 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="init" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.389659 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="init" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.389742 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="init" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.389832 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.389915 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: E0929 19:00:13.390009 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.390100 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.390531 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="879d0631-a279-4337-9540-b76028b54fbc" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.390651 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="57410855-4dd7-4552-8826-d127039e27a4" containerName="dnsmasq-dns" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.391908 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.396738 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lmpft" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.397019 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.397154 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.397204 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.416651 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520538 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520633 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520677 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520728 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520920 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520962 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.520999 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m48s\" (UniqueName: \"kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: 
I0929 19:00:13.626810 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.626906 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m48s\" (UniqueName: \"kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.626968 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.627009 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.627072 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.627119 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.627218 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.628506 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.632368 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.637122 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.637839 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.640827 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.642456 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.659940 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m48s\" (UniqueName: \"kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s\") pod \"ovn-northd-0\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.712631 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.930815 4780 generic.go:334] "Generic (PLEG): container finished" podID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerID="c913fd2444d45e63ee442106a3405e85f9c67db8ef57eec6809dfadcc9a0cd90" exitCode=0 Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.931143 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b869995c-stchb" event={"ID":"9d6093ab-c0af-4e25-96db-b8a6b64ea464","Type":"ContainerDied","Data":"c913fd2444d45e63ee442106a3405e85f9c67db8ef57eec6809dfadcc9a0cd90"} Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.934946 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vsrs" event={"ID":"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8","Type":"ContainerStarted","Data":"13a5524647bbab0fbbfe370f379b83106caeed1a42a7ccefb228fba784b1ddf7"} Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.957700 4780 generic.go:334] "Generic (PLEG): container finished" podID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerID="5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2" exitCode=0 Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.957826 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" event={"ID":"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf","Type":"ContainerDied","Data":"5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2"} Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.957859 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" event={"ID":"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf","Type":"ContainerStarted","Data":"6e3396ac007bfcb0d60c429c73932a0921fde7b37378884a48ff0c2cad1f7ded"} Sep 29 19:00:13 crc kubenswrapper[4780]: I0929 19:00:13.992454 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8vsrs" podStartSLOduration=2.9924359369999998 
podStartE2EDuration="2.992435937s" podCreationTimestamp="2025-09-29 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:13.978826987 +0000 UTC m=+1013.927125031" watchObservedRunningTime="2025-09-29 19:00:13.992435937 +0000 UTC m=+1013.940733981" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.324331 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.329885 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.390341 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.424911 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.426644 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.454368 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.545444 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phm52\" (UniqueName: \"kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.545516 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.545544 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.546569 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.546677 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.648924 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.648995 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.649037 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phm52\" (UniqueName: \"kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.649103 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.649129 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.650214 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.650278 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.650318 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.650472 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.672244 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phm52\" (UniqueName: \"kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52\") pod \"dnsmasq-dns-6c6d5d5bd7-xxb52\" 
(UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.747616 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.769399 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57410855-4dd7-4552-8826-d127039e27a4" path="/var/lib/kubelet/pods/57410855-4dd7-4552-8826-d127039e27a4/volumes" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.770058 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="879d0631-a279-4337-9540-b76028b54fbc" path="/var/lib/kubelet/pods/879d0631-a279-4337-9540-b76028b54fbc/volumes" Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.987174 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerStarted","Data":"b201ab686e7ffc57334c1ed50e9946de9b263459962e100d7ba2be026512b575"} Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.993596 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" event={"ID":"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf","Type":"ContainerStarted","Data":"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b"} Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.993846 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="dnsmasq-dns" containerID="cri-o://3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b" gracePeriod=10 Sep 29 19:00:14 crc kubenswrapper[4780]: I0929 19:00:14.994150 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.002635 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b869995c-stchb" event={"ID":"9d6093ab-c0af-4e25-96db-b8a6b64ea464","Type":"ContainerStarted","Data":"6d22f672c7169ba430c75c14ae52cf1d425d2cd18dd06ecce05537955e65c439"} Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.023020 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" podStartSLOduration=4.022995348 podStartE2EDuration="4.022995348s" podCreationTimestamp="2025-09-29 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:15.019105104 +0000 UTC m=+1014.967403168" watchObservedRunningTime="2025-09-29 19:00:15.022995348 +0000 UTC m=+1014.971293392" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.070390 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86b869995c-stchb" podStartSLOduration=4.0703576 podStartE2EDuration="4.0703576s" podCreationTimestamp="2025-09-29 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:15.053257697 +0000 UTC m=+1015.001555741" watchObservedRunningTime="2025-09-29 19:00:15.0703576 +0000 UTC m=+1015.018655644" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.242777 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 
19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.481123 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.563161 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:00:15 crc kubenswrapper[4780]: E0929 19:00:15.563691 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="init" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.563713 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="init" Sep 29 19:00:15 crc kubenswrapper[4780]: E0929 19:00:15.563724 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="dnsmasq-dns" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.563733 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="dnsmasq-dns" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.563999 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerName="dnsmasq-dns" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.571514 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.576560 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-z7xcj" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.577956 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.578326 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.578109 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.586303 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.596452 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config\") pod \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.596570 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb\") pod \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.596734 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb\") pod \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.596751 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc\") pod \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.596834 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxc84\" (UniqueName: \"kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84\") pod \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\" (UID: \"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf\") " Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.614081 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84" (OuterVolumeSpecName: "kube-api-access-mxc84") pod "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" (UID: "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf"). InnerVolumeSpecName "kube-api-access-mxc84". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.662751 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" (UID: "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.668625 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" (UID: "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698699 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698755 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698823 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698844 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698862 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blc65\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698945 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxc84\" (UniqueName: \"kubernetes.io/projected/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-kube-api-access-mxc84\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698957 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.698968 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.700752 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config" (OuterVolumeSpecName: "config") pod "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" (UID: "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.701538 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" (UID: "4aecfdf6-2935-4836-bd6f-ffb2829ca7bf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801541 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801600 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801628 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blc65\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801710 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801739 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801822 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.801835 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:15 crc kubenswrapper[4780]: E0929 19:00:15.801975 4780 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 19:00:15 crc kubenswrapper[4780]: E0929 19:00:15.801992 4780 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 19:00:15 crc kubenswrapper[4780]: E0929 19:00:15.802082 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift podName:d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1 nodeName:}" failed. No retries permitted until 2025-09-29 19:00:16.302058468 +0000 UTC m=+1016.250356512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift") pod "swift-storage-0" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1") : configmap "swift-ring-files" not found Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.802259 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.802646 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.802771 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.822079 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blc65\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:15 crc kubenswrapper[4780]: I0929 19:00:15.831280 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.013924 4780 generic.go:334] "Generic (PLEG): container finished" podID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerID="b5e622091d7a9d3aaae2f9c25bf82a0e2605f1d5787caddf95c59d81115f1ece" exitCode=0 Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.014006 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" event={"ID":"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd","Type":"ContainerDied","Data":"b5e622091d7a9d3aaae2f9c25bf82a0e2605f1d5787caddf95c59d81115f1ece"} Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.014101 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" event={"ID":"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd","Type":"ContainerStarted","Data":"128bce67e17fcc25dd9f5f656be2e30b846f36c46554917117fdc55656ced194"} Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.016910 4780 generic.go:334] "Generic (PLEG): container finished" podID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" containerID="3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b" exitCode=0 Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.016969 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" event={"ID":"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf","Type":"ContainerDied","Data":"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b"} Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.017028 4780 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" event={"ID":"4aecfdf6-2935-4836-bd6f-ffb2829ca7bf","Type":"ContainerDied","Data":"6e3396ac007bfcb0d60c429c73932a0921fde7b37378884a48ff0c2cad1f7ded"} Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.017015 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d86d68bf7-dpwtl" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.017064 4780 scope.go:117] "RemoveContainer" containerID="3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.017212 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.057110 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-d9jdv"] Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.058568 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.062284 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.062586 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.062710 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.084487 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.087858 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d9jdv"] Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.100787 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d86d68bf7-dpwtl"] Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.208575 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229455 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229511 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229730 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkjx\" (UniqueName: \"kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229833 
4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229892 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.229978 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.230100 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.260098 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.277692 4780 scope.go:117] "RemoveContainer" containerID="5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331705 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkjx\" (UniqueName: \"kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331768 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331789 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331811 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331842 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331863 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331924 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.331967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: E0929 19:00:16.333801 4780 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 19:00:16 crc kubenswrapper[4780]: E0929 19:00:16.333823 4780 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 19:00:16 crc kubenswrapper[4780]: E0929 19:00:16.333855 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift podName:d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1 nodeName:}" failed. No retries permitted until 2025-09-29 19:00:17.333842379 +0000 UTC m=+1017.282140433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift") pod "swift-storage-0" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1") : configmap "swift-ring-files" not found Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.334419 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.334972 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.336907 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.339232 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.342669 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.343274 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.359765 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkjx\" (UniqueName: \"kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx\") pod \"swift-ring-rebalance-d9jdv\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.374420 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.430307 4780 scope.go:117] "RemoveContainer" containerID="3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b" Sep 29 19:00:16 crc kubenswrapper[4780]: E0929 19:00:16.431015 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b\": container with ID starting with 3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b not found: ID does not exist" containerID="3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.431096 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b"} err="failed to get container status \"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b\": rpc error: code = NotFound desc = could not find container \"3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b\": container with ID starting with 3a49469d67599ed8b45ccbbff48dca3bf0f45390509d6c11e8beda0bb4d4c72b not found: ID does not exist" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.431137 4780 scope.go:117] "RemoveContainer" containerID="5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2" Sep 29 19:00:16 crc kubenswrapper[4780]: E0929 19:00:16.434221 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2\": container with ID starting with 5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2 not found: ID does not exist" containerID="5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.434280 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2"} err="failed to get container status \"5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2\": rpc error: code = NotFound desc = could not find container \"5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2\": container with ID starting with 5178c994ffe68cf76ee5c977efb85ca5f12575d0d9b5555f09c709ceb18840b2 not found: ID does not exist" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.768866 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aecfdf6-2935-4836-bd6f-ffb2829ca7bf" path="/var/lib/kubelet/pods/4aecfdf6-2935-4836-bd6f-ffb2829ca7bf/volumes" Sep 29 19:00:16 crc kubenswrapper[4780]: I0929 19:00:16.917161 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d9jdv"] Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.027733 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" event={"ID":"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd","Type":"ContainerStarted","Data":"8bd8acd4279307c838824b77e09b9d3989c03328fe506d375edf06758448dbaf"} Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.029108 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.029218 4780 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-ring-rebalance-d9jdv" event={"ID":"5a3341c3-4401-4e61-aa4b-58943632c521","Type":"ContainerStarted","Data":"e9c3b194ace60a24626e7db2f209c67e2499c07cd552de2bc8526fed07c422e9"} Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.030439 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerStarted","Data":"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6"} Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.046348 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" podStartSLOduration=3.046329621 podStartE2EDuration="3.046329621s" podCreationTimestamp="2025-09-29 19:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:17.044455686 +0000 UTC m=+1016.992753730" watchObservedRunningTime="2025-09-29 19:00:17.046329621 +0000 UTC m=+1016.994627665" Sep 29 19:00:17 crc kubenswrapper[4780]: I0929 19:00:17.353169 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:17 crc kubenswrapper[4780]: E0929 19:00:17.353488 4780 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 19:00:17 crc kubenswrapper[4780]: E0929 19:00:17.353566 4780 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 19:00:17 crc kubenswrapper[4780]: E0929 19:00:17.353675 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift podName:d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1 nodeName:}" failed. No retries permitted until 2025-09-29 19:00:19.353642474 +0000 UTC m=+1019.301940518 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift") pod "swift-storage-0" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1") : configmap "swift-ring-files" not found Sep 29 19:00:18 crc kubenswrapper[4780]: I0929 19:00:18.044259 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerStarted","Data":"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083"} Sep 29 19:00:18 crc kubenswrapper[4780]: I0929 19:00:18.063240 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.992654379 podStartE2EDuration="5.06321645s" podCreationTimestamp="2025-09-29 19:00:13 +0000 UTC" firstStartedPulling="2025-09-29 19:00:14.363777962 +0000 UTC m=+1014.312076006" lastFinishedPulling="2025-09-29 19:00:16.434340033 +0000 UTC m=+1016.382638077" observedRunningTime="2025-09-29 19:00:18.062594872 +0000 UTC m=+1018.010892916" watchObservedRunningTime="2025-09-29 19:00:18.06321645 +0000 UTC m=+1018.011514504" Sep 29 19:00:18 crc kubenswrapper[4780]: I0929 19:00:18.217658 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 29 19:00:18 crc kubenswrapper[4780]: I0929 19:00:18.291242 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 29 19:00:18 crc kubenswrapper[4780]: I0929 19:00:18.713303 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 29 19:00:19 crc kubenswrapper[4780]: I0929 19:00:19.404137 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:19 crc kubenswrapper[4780]: E0929 19:00:19.404546 4780 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 19:00:19 crc kubenswrapper[4780]: E0929 19:00:19.404567 4780 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 19:00:19 crc kubenswrapper[4780]: E0929 19:00:19.404640 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift podName:d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1 nodeName:}" failed. No retries permitted until 2025-09-29 19:00:23.404616229 +0000 UTC m=+1023.352914263 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift") pod "swift-storage-0" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1") : configmap "swift-ring-files" not found Sep 29 19:00:21 crc kubenswrapper[4780]: I0929 19:00:21.977187 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xfmwd"] Sep 29 19:00:21 crc kubenswrapper[4780]: I0929 19:00:21.978886 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:21 crc kubenswrapper[4780]: I0929 19:00:21.998229 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xfmwd"] Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.030295 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.065295 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29fgv\" (UniqueName: \"kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv\") pod \"keystone-db-create-xfmwd\" (UID: \"5d1e36a5-f7ff-4c0b-b950-382d6123b571\") " pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.107677 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d9jdv" event={"ID":"5a3341c3-4401-4e61-aa4b-58943632c521","Type":"ContainerStarted","Data":"7699a71ca39a3c8494c9355a46a16bee6be5ac87a42e0ea8defff31cba877178"} Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.164709 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-d9jdv" podStartSLOduration=1.547803171 podStartE2EDuration="6.164682996s" podCreationTimestamp="2025-09-29 19:00:16 +0000 UTC" firstStartedPulling="2025-09-29 19:00:16.924782259 +0000 UTC m=+1016.873080303" lastFinishedPulling="2025-09-29 19:00:21.541662084 +0000 UTC m=+1021.489960128" observedRunningTime="2025-09-29 19:00:22.15902691 +0000 UTC m=+1022.107324954" watchObservedRunningTime="2025-09-29 19:00:22.164682996 +0000 UTC m=+1022.112981050" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.169184 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29fgv\" (UniqueName: \"kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv\") pod \"keystone-db-create-xfmwd\" (UID: \"5d1e36a5-f7ff-4c0b-b950-382d6123b571\") " pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.202760 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29fgv\" (UniqueName: \"kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv\") pod \"keystone-db-create-xfmwd\" (UID: \"5d1e36a5-f7ff-4c0b-b950-382d6123b571\") " pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.217680 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4slf2"] Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.218698 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4slf2" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.241200 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4slf2"] Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.272215 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8kcr\" (UniqueName: \"kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr\") pod \"placement-db-create-4slf2\" (UID: \"19fa1e2b-3a9e-4e88-97e6-9751eb595e01\") " pod="openstack/placement-db-create-4slf2" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.309463 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.374998 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8kcr\" (UniqueName: \"kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr\") pod \"placement-db-create-4slf2\" (UID: \"19fa1e2b-3a9e-4e88-97e6-9751eb595e01\") " pod="openstack/placement-db-create-4slf2" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.398117 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8kcr\" (UniqueName: \"kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr\") pod \"placement-db-create-4slf2\" (UID: \"19fa1e2b-3a9e-4e88-97e6-9751eb595e01\") " pod="openstack/placement-db-create-4slf2" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.579982 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4slf2" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.613270 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-h6w2d"] Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.614326 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.639242 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h6w2d"] Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.681650 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvplq\" (UniqueName: \"kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq\") pod \"glance-db-create-h6w2d\" (UID: \"2aa811a8-3957-47f8-a24c-e307ece95cf2\") " pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.735489 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xfmwd"] Sep 29 19:00:22 crc kubenswrapper[4780]: W0929 19:00:22.740427 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d1e36a5_f7ff_4c0b_b950_382d6123b571.slice/crio-e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14 WatchSource:0}: Error finding container e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14: Status 404 returned error can't find the container with id e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14 Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.785263 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvplq\" (UniqueName: \"kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq\") pod \"glance-db-create-h6w2d\" (UID: \"2aa811a8-3957-47f8-a24c-e307ece95cf2\") " pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:22 crc kubenswrapper[4780]: I0929 19:00:22.849907 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvplq\" (UniqueName: \"kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq\") pod \"glance-db-create-h6w2d\" (UID: \"2aa811a8-3957-47f8-a24c-e307ece95cf2\") " pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:23 crc kubenswrapper[4780]: I0929 19:00:23.027876 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:23 crc kubenswrapper[4780]: I0929 19:00:23.126486 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xfmwd" event={"ID":"5d1e36a5-f7ff-4c0b-b950-382d6123b571","Type":"ContainerStarted","Data":"e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14"} Sep 29 19:00:23 crc kubenswrapper[4780]: I0929 19:00:23.152623 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4slf2"] Sep 29 19:00:23 crc kubenswrapper[4780]: W0929 19:00:23.181459 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fa1e2b_3a9e_4e88_97e6_9751eb595e01.slice/crio-f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b WatchSource:0}: Error finding container f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b: Status 404 returned error can't find the container with id f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b Sep 29 19:00:23 crc kubenswrapper[4780]: W0929 19:00:23.503674 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aa811a8_3957_47f8_a24c_e307ece95cf2.slice/crio-8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165 WatchSource:0}: Error finding container 8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165: Status 404 returned error can't find the container with id 8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165 Sep 29 19:00:23 crc kubenswrapper[4780]: E0929 19:00:23.504938 4780 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 29 19:00:23 crc kubenswrapper[4780]: E0929 19:00:23.507370 4780 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 29 19:00:23 crc kubenswrapper[4780]: I0929 19:00:23.504816 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:23 crc kubenswrapper[4780]: E0929 19:00:23.507434 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift podName:d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1 nodeName:}" failed. No retries permitted until 2025-09-29 19:00:31.507411534 +0000 UTC m=+1031.455709588 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift") pod "swift-storage-0" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1") : configmap "swift-ring-files" not found Sep 29 19:00:23 crc kubenswrapper[4780]: I0929 19:00:23.510673 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-h6w2d"] Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.136136 4780 generic.go:334] "Generic (PLEG): container finished" podID="5d1e36a5-f7ff-4c0b-b950-382d6123b571" containerID="51ea86d45c906caef27896e8dcb0cc239773856c5d0c5bb99056af02148d0f04" exitCode=0 Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.136211 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xfmwd" event={"ID":"5d1e36a5-f7ff-4c0b-b950-382d6123b571","Type":"ContainerDied","Data":"51ea86d45c906caef27896e8dcb0cc239773856c5d0c5bb99056af02148d0f04"} Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.138027 4780 generic.go:334] "Generic (PLEG): container finished" podID="2aa811a8-3957-47f8-a24c-e307ece95cf2" containerID="c1a07f5d0a702a2af8115f7cb08035b3389648b80d070cec59a16da14f20fe62" exitCode=0 Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.138107 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h6w2d" event={"ID":"2aa811a8-3957-47f8-a24c-e307ece95cf2","Type":"ContainerDied","Data":"c1a07f5d0a702a2af8115f7cb08035b3389648b80d070cec59a16da14f20fe62"} Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.138126 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h6w2d" event={"ID":"2aa811a8-3957-47f8-a24c-e307ece95cf2","Type":"ContainerStarted","Data":"8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165"} Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.140340 4780 generic.go:334] "Generic (PLEG): container finished" podID="19fa1e2b-3a9e-4e88-97e6-9751eb595e01" containerID="9b3b49adc067cecc8ca3e59bbfde4c5f491295762d4a36a1f8d029d61c87f0ca" exitCode=0 Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.140428 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4slf2" event={"ID":"19fa1e2b-3a9e-4e88-97e6-9751eb595e01","Type":"ContainerDied","Data":"9b3b49adc067cecc8ca3e59bbfde4c5f491295762d4a36a1f8d029d61c87f0ca"} Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.140469 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4slf2" event={"ID":"19fa1e2b-3a9e-4e88-97e6-9751eb595e01","Type":"ContainerStarted","Data":"f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b"} Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.749479 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.821921 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:24 crc kubenswrapper[4780]: I0929 19:00:24.827338 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86b869995c-stchb" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="dnsmasq-dns" containerID="cri-o://6d22f672c7169ba430c75c14ae52cf1d425d2cd18dd06ecce05537955e65c439" gracePeriod=10 Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.155075 4780 generic.go:334] "Generic (PLEG): container finished" 
podID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerID="6d22f672c7169ba430c75c14ae52cf1d425d2cd18dd06ecce05537955e65c439" exitCode=0 Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.155326 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b869995c-stchb" event={"ID":"9d6093ab-c0af-4e25-96db-b8a6b64ea464","Type":"ContainerDied","Data":"6d22f672c7169ba430c75c14ae52cf1d425d2cd18dd06ecce05537955e65c439"} Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.304656 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.402372 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb\") pod \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.402438 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc\") pod \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.402478 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config\") pod \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.402683 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2lnw\" (UniqueName: \"kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw\") pod \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\" (UID: \"9d6093ab-c0af-4e25-96db-b8a6b64ea464\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.414371 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw" (OuterVolumeSpecName: "kube-api-access-d2lnw") pod "9d6093ab-c0af-4e25-96db-b8a6b64ea464" (UID: "9d6093ab-c0af-4e25-96db-b8a6b64ea464"). InnerVolumeSpecName "kube-api-access-d2lnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.452774 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d6093ab-c0af-4e25-96db-b8a6b64ea464" (UID: "9d6093ab-c0af-4e25-96db-b8a6b64ea464"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.470584 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config" (OuterVolumeSpecName: "config") pod "9d6093ab-c0af-4e25-96db-b8a6b64ea464" (UID: "9d6093ab-c0af-4e25-96db-b8a6b64ea464"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.488887 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d6093ab-c0af-4e25-96db-b8a6b64ea464" (UID: "9d6093ab-c0af-4e25-96db-b8a6b64ea464"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.513325 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2lnw\" (UniqueName: \"kubernetes.io/projected/9d6093ab-c0af-4e25-96db-b8a6b64ea464-kube-api-access-d2lnw\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.513376 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.513400 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.513413 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6093ab-c0af-4e25-96db-b8a6b64ea464-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.566306 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4slf2" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.582605 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.628955 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.715591 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8kcr\" (UniqueName: \"kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr\") pod \"19fa1e2b-3a9e-4e88-97e6-9751eb595e01\" (UID: \"19fa1e2b-3a9e-4e88-97e6-9751eb595e01\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.715717 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29fgv\" (UniqueName: \"kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv\") pod \"5d1e36a5-f7ff-4c0b-b950-382d6123b571\" (UID: \"5d1e36a5-f7ff-4c0b-b950-382d6123b571\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.719522 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr" (OuterVolumeSpecName: "kube-api-access-p8kcr") pod "19fa1e2b-3a9e-4e88-97e6-9751eb595e01" (UID: "19fa1e2b-3a9e-4e88-97e6-9751eb595e01"). InnerVolumeSpecName "kube-api-access-p8kcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.719681 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv" (OuterVolumeSpecName: "kube-api-access-29fgv") pod "5d1e36a5-f7ff-4c0b-b950-382d6123b571" (UID: "5d1e36a5-f7ff-4c0b-b950-382d6123b571"). InnerVolumeSpecName "kube-api-access-29fgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.817014 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvplq\" (UniqueName: \"kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq\") pod \"2aa811a8-3957-47f8-a24c-e307ece95cf2\" (UID: \"2aa811a8-3957-47f8-a24c-e307ece95cf2\") " Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.818086 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8kcr\" (UniqueName: \"kubernetes.io/projected/19fa1e2b-3a9e-4e88-97e6-9751eb595e01-kube-api-access-p8kcr\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.818118 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29fgv\" (UniqueName: \"kubernetes.io/projected/5d1e36a5-f7ff-4c0b-b950-382d6123b571-kube-api-access-29fgv\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.823335 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq" (OuterVolumeSpecName: "kube-api-access-xvplq") pod "2aa811a8-3957-47f8-a24c-e307ece95cf2" (UID: "2aa811a8-3957-47f8-a24c-e307ece95cf2"). InnerVolumeSpecName "kube-api-access-xvplq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:25 crc kubenswrapper[4780]: I0929 19:00:25.920593 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvplq\" (UniqueName: \"kubernetes.io/projected/2aa811a8-3957-47f8-a24c-e307ece95cf2-kube-api-access-xvplq\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.165802 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xfmwd" event={"ID":"5d1e36a5-f7ff-4c0b-b950-382d6123b571","Type":"ContainerDied","Data":"e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14"} Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.165848 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e556f4c131e1c2becc120b0d0e47de1435cd2c33358f1fe62775b3a8ac5eca14" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.165908 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xfmwd" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.171705 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b869995c-stchb" event={"ID":"9d6093ab-c0af-4e25-96db-b8a6b64ea464","Type":"ContainerDied","Data":"567bf938b7dd7d271a109b4c2b0ca8a9525132334eb7d87dc25a2c717ff8ab11"} Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.171759 4780 scope.go:117] "RemoveContainer" containerID="6d22f672c7169ba430c75c14ae52cf1d425d2cd18dd06ecce05537955e65c439" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.171888 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b869995c-stchb" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.175888 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-h6w2d" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.175936 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-h6w2d" event={"ID":"2aa811a8-3957-47f8-a24c-e307ece95cf2","Type":"ContainerDied","Data":"8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165"} Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.176030 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ec6d92de37d6f66d9709b8d2246a3692ec43587ffb096160ee0a85780a1c165" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.182699 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4slf2" event={"ID":"19fa1e2b-3a9e-4e88-97e6-9751eb595e01","Type":"ContainerDied","Data":"f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b"} Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.182791 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01ccb7f0a4610359ebe61a2bfb0b759a2294c7b07feb25f331eb1e1e12d767b" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.182907 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4slf2" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.222440 4780 scope.go:117] "RemoveContainer" containerID="c913fd2444d45e63ee442106a3405e85f9c67db8ef57eec6809dfadcc9a0cd90" Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.222686 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.230815 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b869995c-stchb"] Sep 29 19:00:26 crc kubenswrapper[4780]: I0929 19:00:26.765626 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" path="/var/lib/kubelet/pods/9d6093ab-c0af-4e25-96db-b8a6b64ea464/volumes" Sep 29 19:00:28 crc kubenswrapper[4780]: I0929 19:00:28.792448 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 29 19:00:30 crc kubenswrapper[4780]: I0929 19:00:30.218091 4780 generic.go:334] "Generic (PLEG): container finished" podID="5a3341c3-4401-4e61-aa4b-58943632c521" containerID="7699a71ca39a3c8494c9355a46a16bee6be5ac87a42e0ea8defff31cba877178" exitCode=0 Sep 29 19:00:30 crc kubenswrapper[4780]: I0929 19:00:30.218149 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d9jdv" event={"ID":"5a3341c3-4401-4e61-aa4b-58943632c521","Type":"ContainerDied","Data":"7699a71ca39a3c8494c9355a46a16bee6be5ac87a42e0ea8defff31cba877178"} Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.520819 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.529673 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"swift-storage-0\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " pod="openstack/swift-storage-0" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.577911 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.622864 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.622964 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.623005 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.623086 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.623169 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.624638 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.625532 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.652604 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.655711 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts" (OuterVolumeSpecName: "scripts") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.662130 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724366 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724474 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pkjx\" (UniqueName: \"kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx\") pod \"5a3341c3-4401-4e61-aa4b-58943632c521\" (UID: \"5a3341c3-4401-4e61-aa4b-58943632c521\") " Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724870 4780 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5a3341c3-4401-4e61-aa4b-58943632c521-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724891 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724900 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724912 4780 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.724921 4780 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5a3341c3-4401-4e61-aa4b-58943632c521-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.728205 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx" (OuterVolumeSpecName: "kube-api-access-5pkjx") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "kube-api-access-5pkjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.730404 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5a3341c3-4401-4e61-aa4b-58943632c521" (UID: "5a3341c3-4401-4e61-aa4b-58943632c521"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.792839 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.827863 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pkjx\" (UniqueName: \"kubernetes.io/projected/5a3341c3-4401-4e61-aa4b-58943632c521-kube-api-access-5pkjx\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.827923 4780 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5a3341c3-4401-4e61-aa4b-58943632c521-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.994640 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-928f-account-create-wfndn"] Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.995572 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="dnsmasq-dns" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.995589 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="dnsmasq-dns" Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.995624 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1e36a5-f7ff-4c0b-b950-382d6123b571" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.995633 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1e36a5-f7ff-4c0b-b950-382d6123b571" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.995655 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fa1e2b-3a9e-4e88-97e6-9751eb595e01" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.995667 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fa1e2b-3a9e-4e88-97e6-9751eb595e01" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.995684 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa811a8-3957-47f8-a24c-e307ece95cf2" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.995691 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa811a8-3957-47f8-a24c-e307ece95cf2" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.995711 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3341c3-4401-4e61-aa4b-58943632c521" containerName="swift-ring-rebalance" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.996730 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3341c3-4401-4e61-aa4b-58943632c521" containerName="swift-ring-rebalance" Sep 29 19:00:31 crc kubenswrapper[4780]: E0929 19:00:31.996751 4780 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="init" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.996763 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="init" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.996987 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fa1e2b-3a9e-4e88-97e6-9751eb595e01" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.997014 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1e36a5-f7ff-4c0b-b950-382d6123b571" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.997027 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3341c3-4401-4e61-aa4b-58943632c521" containerName="swift-ring-rebalance" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.997039 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa811a8-3957-47f8-a24c-e307ece95cf2" containerName="mariadb-database-create" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.997069 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6093ab-c0af-4e25-96db-b8a6b64ea464" containerName="dnsmasq-dns" Sep 29 19:00:31 crc kubenswrapper[4780]: I0929 19:00:31.997780 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.008872 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.014964 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-928f-account-create-wfndn"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.133078 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zgw\" (UniqueName: \"kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw\") pod \"keystone-928f-account-create-wfndn\" (UID: \"e81be594-85ee-4be4-8afd-ec5580651ec7\") " pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.234399 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d9jdv" event={"ID":"5a3341c3-4401-4e61-aa4b-58943632c521","Type":"ContainerDied","Data":"e9c3b194ace60a24626e7db2f209c67e2499c07cd552de2bc8526fed07c422e9"} Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.234440 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d9jdv" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.234441 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9c3b194ace60a24626e7db2f209c67e2499c07cd552de2bc8526fed07c422e9" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.234418 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zgw\" (UniqueName: \"kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw\") pod \"keystone-928f-account-create-wfndn\" (UID: \"e81be594-85ee-4be4-8afd-ec5580651ec7\") " pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.295273 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zgw\" (UniqueName: \"kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw\") pod \"keystone-928f-account-create-wfndn\" (UID: \"e81be594-85ee-4be4-8afd-ec5580651ec7\") " pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.330964 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.401680 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b644-account-create-4hb8n"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.404097 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.408754 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.421178 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b644-account-create-4hb8n"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.452822 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:00:32 crc kubenswrapper[4780]: W0929 19:00:32.460234 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e6e9f4_21b9_4a8f_aa01_3f0924013fe1.slice/crio-67a95d1bfbd9000e22869418dd0095b23f2601ac1a14e9957d34dae1a5b2a75e WatchSource:0}: Error finding container 67a95d1bfbd9000e22869418dd0095b23f2601ac1a14e9957d34dae1a5b2a75e: Status 404 returned error can't find the container with id 67a95d1bfbd9000e22869418dd0095b23f2601ac1a14e9957d34dae1a5b2a75e Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.540142 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mxz\" (UniqueName: \"kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz\") pod \"placement-b644-account-create-4hb8n\" (UID: \"90114ff5-5dc3-4755-be92-df3f1f7a12f0\") " pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.642288 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mxz\" (UniqueName: \"kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz\") pod \"placement-b644-account-create-4hb8n\" (UID: \"90114ff5-5dc3-4755-be92-df3f1f7a12f0\") " pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:32 
crc kubenswrapper[4780]: I0929 19:00:32.666211 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mxz\" (UniqueName: \"kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz\") pod \"placement-b644-account-create-4hb8n\" (UID: \"90114ff5-5dc3-4755-be92-df3f1f7a12f0\") " pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.706518 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fda4-account-create-xncgz"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.707718 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.711399 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.717680 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fda4-account-create-xncgz"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.725077 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.846171 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggsj\" (UniqueName: \"kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj\") pod \"glance-fda4-account-create-xncgz\" (UID: \"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9\") " pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.897508 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-928f-account-create-wfndn"] Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.948380 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggsj\" (UniqueName: \"kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj\") pod \"glance-fda4-account-create-xncgz\" (UID: \"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9\") " pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:32 crc kubenswrapper[4780]: I0929 19:00:32.974577 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggsj\" (UniqueName: \"kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj\") pod \"glance-fda4-account-create-xncgz\" (UID: \"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9\") " pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.026654 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.251540 4780 generic.go:334] "Generic (PLEG): container finished" podID="e81be594-85ee-4be4-8afd-ec5580651ec7" containerID="9745fc2edb04b632dafc0c294af08ddefea82649ef893b0c2f3e77cb7ff298cf" exitCode=0 Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.251622 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-928f-account-create-wfndn" event={"ID":"e81be594-85ee-4be4-8afd-ec5580651ec7","Type":"ContainerDied","Data":"9745fc2edb04b632dafc0c294af08ddefea82649ef893b0c2f3e77cb7ff298cf"} Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.251659 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-928f-account-create-wfndn" event={"ID":"e81be594-85ee-4be4-8afd-ec5580651ec7","Type":"ContainerStarted","Data":"3afce32034beb25584d1245736e893cbd490946cc6b4964f2f31dc14e5237b3d"} Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.254368 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"67a95d1bfbd9000e22869418dd0095b23f2601ac1a14e9957d34dae1a5b2a75e"} Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.371672 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b644-account-create-4hb8n"] Sep 29 19:00:33 crc kubenswrapper[4780]: I0929 19:00:33.504295 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fda4-account-create-xncgz"] Sep 29 19:00:33 crc kubenswrapper[4780]: W0929 19:00:33.655083 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4b7df7_18ee_4c71_b7a3_56d799f45bf9.slice/crio-b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e WatchSource:0}: Error finding container b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e: Status 404 returned error can't find the container with id b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.265181 4780 generic.go:334] "Generic (PLEG): container finished" podID="90114ff5-5dc3-4755-be92-df3f1f7a12f0" containerID="5cc733f5ad77b86d578f66e4023e8bf1bfb45a2b08306550d0ed413b92d6dfaf" exitCode=0 Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.265268 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b644-account-create-4hb8n" event={"ID":"90114ff5-5dc3-4755-be92-df3f1f7a12f0","Type":"ContainerDied","Data":"5cc733f5ad77b86d578f66e4023e8bf1bfb45a2b08306550d0ed413b92d6dfaf"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.265297 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b644-account-create-4hb8n" event={"ID":"90114ff5-5dc3-4755-be92-df3f1f7a12f0","Type":"ContainerStarted","Data":"14b43c7feaf0fab1136533e3fc3f4331b56a01a990d8aeefb0c071d1704336fd"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.269612 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.269663 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.271466 4780 generic.go:334] "Generic (PLEG): container finished" podID="bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" containerID="b9bc8661e57232e341261e8e572a959de5b4986fba652b9ef4160b4a3d945a5f" exitCode=0 Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.271539 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fda4-account-create-xncgz" event={"ID":"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9","Type":"ContainerDied","Data":"b9bc8661e57232e341261e8e572a959de5b4986fba652b9ef4160b4a3d945a5f"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.271564 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fda4-account-create-xncgz" event={"ID":"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9","Type":"ContainerStarted","Data":"b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e"} Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.649535 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.701946 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6zgw\" (UniqueName: \"kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw\") pod \"e81be594-85ee-4be4-8afd-ec5580651ec7\" (UID: \"e81be594-85ee-4be4-8afd-ec5580651ec7\") " Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.709850 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw" (OuterVolumeSpecName: "kube-api-access-q6zgw") pod "e81be594-85ee-4be4-8afd-ec5580651ec7" (UID: "e81be594-85ee-4be4-8afd-ec5580651ec7"). InnerVolumeSpecName "kube-api-access-q6zgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:34 crc kubenswrapper[4780]: I0929 19:00:34.803790 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zgw\" (UniqueName: \"kubernetes.io/projected/e81be594-85ee-4be4-8afd-ec5580651ec7-kube-api-access-q6zgw\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.282492 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-928f-account-create-wfndn" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.282488 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-928f-account-create-wfndn" event={"ID":"e81be594-85ee-4be4-8afd-ec5580651ec7","Type":"ContainerDied","Data":"3afce32034beb25584d1245736e893cbd490946cc6b4964f2f31dc14e5237b3d"} Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.283023 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3afce32034beb25584d1245736e893cbd490946cc6b4964f2f31dc14e5237b3d" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.285444 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61"} Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.285493 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1"} Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.649501 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.656466 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.719546 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mxz\" (UniqueName: \"kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz\") pod \"90114ff5-5dc3-4755-be92-df3f1f7a12f0\" (UID: \"90114ff5-5dc3-4755-be92-df3f1f7a12f0\") " Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.719603 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kggsj\" (UniqueName: \"kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj\") pod \"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9\" (UID: \"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9\") " Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.724253 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz" (OuterVolumeSpecName: "kube-api-access-s4mxz") pod "90114ff5-5dc3-4755-be92-df3f1f7a12f0" (UID: "90114ff5-5dc3-4755-be92-df3f1f7a12f0"). InnerVolumeSpecName "kube-api-access-s4mxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.727185 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj" (OuterVolumeSpecName: "kube-api-access-kggsj") pod "bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" (UID: "bb4b7df7-18ee-4c71-b7a3-56d799f45bf9"). InnerVolumeSpecName "kube-api-access-kggsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.821934 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mxz\" (UniqueName: \"kubernetes.io/projected/90114ff5-5dc3-4755-be92-df3f1f7a12f0-kube-api-access-s4mxz\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:35 crc kubenswrapper[4780]: I0929 19:00:35.821979 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kggsj\" (UniqueName: \"kubernetes.io/projected/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9-kube-api-access-kggsj\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.294240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b644-account-create-4hb8n" event={"ID":"90114ff5-5dc3-4755-be92-df3f1f7a12f0","Type":"ContainerDied","Data":"14b43c7feaf0fab1136533e3fc3f4331b56a01a990d8aeefb0c071d1704336fd"} Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.294629 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b43c7feaf0fab1136533e3fc3f4331b56a01a990d8aeefb0c071d1704336fd" Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.294475 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b644-account-create-4hb8n" Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.296783 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fda4-account-create-xncgz" event={"ID":"bb4b7df7-18ee-4c71-b7a3-56d799f45bf9","Type":"ContainerDied","Data":"b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e"} Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.296807 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1133afe88247b5d0e1ab887d25783b2e5c1fec73b0df8877852e22acb50023e" Sep 29 19:00:36 crc kubenswrapper[4780]: I0929 19:00:36.296838 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fda4-account-create-xncgz" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.831207 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rvtnb"] Sep 29 19:00:37 crc kubenswrapper[4780]: E0929 19:00:37.833584 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81be594-85ee-4be4-8afd-ec5580651ec7" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.833696 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81be594-85ee-4be4-8afd-ec5580651ec7" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: E0929 19:00:37.833785 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90114ff5-5dc3-4755-be92-df3f1f7a12f0" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.833848 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="90114ff5-5dc3-4755-be92-df3f1f7a12f0" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: E0929 19:00:37.833922 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.834201 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.834475 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81be594-85ee-4be4-8afd-ec5580651ec7" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.834600 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="90114ff5-5dc3-4755-be92-df3f1f7a12f0" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.834674 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" containerName="mariadb-account-create" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.835518 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.867191 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.867464 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-w474c" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.872315 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvtnb"] Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.971806 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5r4w\" (UniqueName: \"kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.971841 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.971872 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:37 crc kubenswrapper[4780]: I0929 19:00:37.971909 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.073588 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5r4w\" (UniqueName: \"kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.073638 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.073686 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.073748 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle\") pod 
\"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.083369 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.083461 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.084011 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.098605 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5r4w\" (UniqueName: \"kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w\") pod \"glance-db-sync-rvtnb\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.247749 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvtnb" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.322772 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09"} Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.322830 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311"} Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.322857 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf"} Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.647944 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hzb5x" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" probeResult="failure" output=< Sep 29 19:00:38 crc kubenswrapper[4780]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 29 19:00:38 crc kubenswrapper[4780]: > Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.779627 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.781016 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:00:38 crc kubenswrapper[4780]: I0929 19:00:38.810702 4780 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvtnb"] Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.009932 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hzb5x-config-zmj64"] Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.018387 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.020644 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x-config-zmj64"] Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.021131 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.192845 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.192906 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.192943 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.193030 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchgb\" (UniqueName: \"kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.193062 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.193087 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.294917 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchgb\" (UniqueName: \"kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb\") pod 
\"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.294963 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.294996 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295069 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295107 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295136 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295369 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295364 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.295414 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.296998 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: 
\"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.299242 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.318203 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchgb\" (UniqueName: \"kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb\") pod \"ovn-controller-hzb5x-config-zmj64\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.336605 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435"} Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.339544 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvtnb" event={"ID":"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1","Type":"ContainerStarted","Data":"1b359d29b1fdd3296b851acd03710b7b023663fcb7ebca9157d905b5a5173538"} Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.345522 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:39 crc kubenswrapper[4780]: I0929 19:00:39.900691 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x-config-zmj64"] Sep 29 19:00:39 crc kubenswrapper[4780]: W0929 19:00:39.910730 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3205890_0fee_4435_8cea_55fc69b41849.slice/crio-6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa WatchSource:0}: Error finding container 6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa: Status 404 returned error can't find the container with id 6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa Sep 29 19:00:40 crc kubenswrapper[4780]: I0929 19:00:40.350611 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-zmj64" event={"ID":"c3205890-0fee-4435-8cea-55fc69b41849","Type":"ContainerStarted","Data":"6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa"} Sep 29 19:00:40 crc kubenswrapper[4780]: I0929 19:00:40.356123 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b"} Sep 29 19:00:40 crc kubenswrapper[4780]: I0929 19:00:40.356156 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6"} Sep 29 19:00:40 crc kubenswrapper[4780]: I0929 19:00:40.356207 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8"} Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.377870 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-zmj64" event={"ID":"c3205890-0fee-4435-8cea-55fc69b41849","Type":"ContainerStarted","Data":"35e5b426f921a88251cd1fd6a84ce25592033487febb99da7b9b4dcfe094b1cb"} Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.382920 4780 generic.go:334] "Generic (PLEG): container finished" podID="d2ee2741-9417-4698-b550-7c596d00d271" containerID="9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3" exitCode=0 Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.383009 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerDied","Data":"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3"} Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.396423 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2"} Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.401109 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hzb5x-config-zmj64" podStartSLOduration=3.401094949 podStartE2EDuration="3.401094949s" podCreationTimestamp="2025-09-29 19:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:41.397555485 +0000 UTC m=+1041.345853529" watchObservedRunningTime="2025-09-29 19:00:41.401094949 +0000 UTC m=+1041.349392993" Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.409742 4780 generic.go:334] "Generic (PLEG): container finished" podID="b90472c3-a09d-433c-922b-d164a11636e6" containerID="f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b" exitCode=0 Sep 29 19:00:41 crc kubenswrapper[4780]: I0929 19:00:41.409787 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerDied","Data":"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.423248 4780 generic.go:334] "Generic (PLEG): container finished" podID="c3205890-0fee-4435-8cea-55fc69b41849" containerID="35e5b426f921a88251cd1fd6a84ce25592033487febb99da7b9b4dcfe094b1cb" exitCode=0 Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.423490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-zmj64" event={"ID":"c3205890-0fee-4435-8cea-55fc69b41849","Type":"ContainerDied","Data":"35e5b426f921a88251cd1fd6a84ce25592033487febb99da7b9b4dcfe094b1cb"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.428815 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerStarted","Data":"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.429015 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 19:00:42 crc kubenswrapper[4780]: 
I0929 19:00:42.449719 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.449764 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.449779 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerStarted","Data":"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.453192 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerStarted","Data":"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0"} Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.454114 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.477867 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.037148882 podStartE2EDuration="1m5.477845899s" podCreationTimestamp="2025-09-29 18:59:37 +0000 UTC" firstStartedPulling="2025-09-29 18:59:39.007999868 +0000 UTC m=+978.956297912" lastFinishedPulling="2025-09-29 19:00:10.448696895 +0000 UTC m=+1010.396994929" observedRunningTime="2025-09-29 19:00:42.466251358 +0000 UTC m=+1042.414549422" watchObservedRunningTime="2025-09-29 19:00:42.477845899 +0000 UTC m=+1042.426143943" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.506039 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.501533612 podStartE2EDuration="28.506022417s" podCreationTimestamp="2025-09-29 19:00:14 +0000 UTC" firstStartedPulling="2025-09-29 19:00:32.461853525 +0000 UTC m=+1032.410151569" lastFinishedPulling="2025-09-29 19:00:39.46634233 +0000 UTC m=+1039.414640374" observedRunningTime="2025-09-29 19:00:42.503539004 +0000 UTC m=+1042.451837048" watchObservedRunningTime="2025-09-29 19:00:42.506022417 +0000 UTC m=+1042.454320461" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.537728 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.819377085 podStartE2EDuration="1m5.537709878s" podCreationTimestamp="2025-09-29 18:59:37 +0000 UTC" firstStartedPulling="2025-09-29 18:59:39.510227831 +0000 UTC m=+979.458525875" lastFinishedPulling="2025-09-29 19:00:10.228560624 +0000 UTC m=+1010.176858668" observedRunningTime="2025-09-29 19:00:42.531562358 +0000 UTC m=+1042.479860422" watchObservedRunningTime="2025-09-29 19:00:42.537709878 +0000 UTC m=+1042.486007912" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.778177 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.780648 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.783701 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.804939 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.880841 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.880905 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.880990 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.881015 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mr8m\" (UniqueName: \"kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.881081 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.881110 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982749 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982853 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " 
pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982873 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982923 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982942 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mr8m\" (UniqueName: \"kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.982974 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.983812 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.984385 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.984848 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.984871 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:42 crc kubenswrapper[4780]: I0929 19:00:42.985164 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:43 crc kubenswrapper[4780]: 
I0929 19:00:43.011201 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mr8m\" (UniqueName: \"kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m\") pod \"dnsmasq-dns-6cfbb96789-csl6f\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.099392 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.634410 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:00:43 crc kubenswrapper[4780]: W0929 19:00:43.647288 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b4750fb_fa69_4c76_b3c8_8c250e933533.slice/crio-9ade6f081c1319fadb048effcb960cc927af1dff4c66be5b4816c4704ce1e880 WatchSource:0}: Error finding container 9ade6f081c1319fadb048effcb960cc927af1dff4c66be5b4816c4704ce1e880: Status 404 returned error can't find the container with id 9ade6f081c1319fadb048effcb960cc927af1dff4c66be5b4816c4704ce1e880 Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.683575 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hzb5x" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.807661 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.896671 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dchgb\" (UniqueName: \"kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.896740 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.896813 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.896874 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.896889 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.897018 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.897104 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run\") pod \"c3205890-0fee-4435-8cea-55fc69b41849\" (UID: \"c3205890-0fee-4435-8cea-55fc69b41849\") " Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.897856 4780 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.897889 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run" (OuterVolumeSpecName: "var-run") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.897915 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.898345 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.898500 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts" (OuterVolumeSpecName: "scripts") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:43 crc kubenswrapper[4780]: I0929 19:00:43.902109 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb" (OuterVolumeSpecName: "kube-api-access-dchgb") pod "c3205890-0fee-4435-8cea-55fc69b41849" (UID: "c3205890-0fee-4435-8cea-55fc69b41849"). InnerVolumeSpecName "kube-api-access-dchgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:43.999827 4780 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.000175 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.000187 4780 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3205890-0fee-4435-8cea-55fc69b41849-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.000197 4780 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3205890-0fee-4435-8cea-55fc69b41849-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.000206 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dchgb\" (UniqueName: \"kubernetes.io/projected/c3205890-0fee-4435-8cea-55fc69b41849-kube-api-access-dchgb\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.471623 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-zmj64" event={"ID":"c3205890-0fee-4435-8cea-55fc69b41849","Type":"ContainerDied","Data":"6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa"} Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.471671 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ef340123751827ea06518c1493fc4cd33bfc87cb61204981501b22ed1c3ccaa" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.471746 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-zmj64" Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.486923 4780 generic.go:334] "Generic (PLEG): container finished" podID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerID="929a871ab45437d0084ba2f48bdddf59853c3cd5746929435fcc27ba0b9d3dd9" exitCode=0 Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.487026 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" event={"ID":"3b4750fb-fa69-4c76-b3c8-8c250e933533","Type":"ContainerDied","Data":"929a871ab45437d0084ba2f48bdddf59853c3cd5746929435fcc27ba0b9d3dd9"} Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.487150 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" event={"ID":"3b4750fb-fa69-4c76-b3c8-8c250e933533","Type":"ContainerStarted","Data":"9ade6f081c1319fadb048effcb960cc927af1dff4c66be5b4816c4704ce1e880"} Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.906470 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hzb5x-config-zmj64"] Sep 29 19:00:44 crc kubenswrapper[4780]: I0929 19:00:44.919075 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hzb5x-config-zmj64"] Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.003646 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hzb5x-config-5pmh5"] Sep 29 19:00:45 crc kubenswrapper[4780]: E0929 19:00:45.004032 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3205890-0fee-4435-8cea-55fc69b41849" containerName="ovn-config" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.004062 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3205890-0fee-4435-8cea-55fc69b41849" containerName="ovn-config" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.004233 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3205890-0fee-4435-8cea-55fc69b41849" containerName="ovn-config" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.005081 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.006930 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.016317 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x-config-5pmh5"] Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.020956 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjqvm\" (UniqueName: \"kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.021014 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.021081 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.021111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.021186 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.021225 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.122798 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjqvm\" (UniqueName: \"kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.122855 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.122909 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.122940 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123011 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123057 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123312 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123322 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123328 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.123656 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.125429 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.144959 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjqvm\" (UniqueName: \"kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm\") pod \"ovn-controller-hzb5x-config-5pmh5\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:45 crc kubenswrapper[4780]: I0929 19:00:45.322115 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:46 crc kubenswrapper[4780]: I0929 19:00:46.763592 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3205890-0fee-4435-8cea-55fc69b41849" path="/var/lib/kubelet/pods/c3205890-0fee-4435-8cea-55fc69b41849/volumes" Sep 29 19:00:55 crc kubenswrapper[4780]: E0929 19:00:55.048775 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070" Sep 29 19:00:55 crc kubenswrapper[4780]: E0929 19:00:55.049856 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5r4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPoli
cy:nil,} start failed in pod glance-db-sync-rvtnb_openstack(91d3cfe6-96f0-442a-aa5d-8a08ff10eed1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 19:00:55 crc kubenswrapper[4780]: E0929 19:00:55.051104 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-rvtnb" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" Sep 29 19:00:55 crc kubenswrapper[4780]: I0929 19:00:55.466036 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hzb5x-config-5pmh5"] Sep 29 19:00:55 crc kubenswrapper[4780]: W0929 19:00:55.474293 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03266c1_544f_4abb_a2f0_88e077d77e0f.slice/crio-3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32 WatchSource:0}: Error finding container 3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32: Status 404 returned error can't find the container with id 3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32 Sep 29 19:00:55 crc kubenswrapper[4780]: I0929 19:00:55.600995 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" event={"ID":"3b4750fb-fa69-4c76-b3c8-8c250e933533","Type":"ContainerStarted","Data":"4a56d8c6ff3bd869bd7453e6297a3217d7fae5390ca20a21c7949b36f190b2fb"} Sep 29 19:00:55 crc kubenswrapper[4780]: I0929 19:00:55.601525 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:00:55 crc kubenswrapper[4780]: I0929 19:00:55.603161 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-5pmh5" event={"ID":"a03266c1-544f-4abb-a2f0-88e077d77e0f","Type":"ContainerStarted","Data":"3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32"} Sep 29 19:00:55 crc kubenswrapper[4780]: E0929 19:00:55.605533 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:bdfed2a176a064bf70082602a1f319eace2d9003ff1117b1e48b7f2130840070\\\"\"" pod="openstack/glance-db-sync-rvtnb" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" Sep 29 19:00:55 crc kubenswrapper[4780]: I0929 19:00:55.634369 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" podStartSLOduration=13.634352602 podStartE2EDuration="13.634352602s" podCreationTimestamp="2025-09-29 19:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:00:55.630335404 +0000 UTC m=+1055.578633448" watchObservedRunningTime="2025-09-29 19:00:55.634352602 +0000 UTC m=+1055.582650646" Sep 29 19:00:56 crc kubenswrapper[4780]: I0929 19:00:56.623024 4780 generic.go:334] "Generic (PLEG): container finished" podID="a03266c1-544f-4abb-a2f0-88e077d77e0f" containerID="d83a512957c0253e871e9228adf6c0ea9b12c0719bac9544b8bf3ce9dc88c419" exitCode=0 Sep 29 19:00:56 crc kubenswrapper[4780]: I0929 19:00:56.623110 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-5pmh5" 
event={"ID":"a03266c1-544f-4abb-a2f0-88e077d77e0f","Type":"ContainerDied","Data":"d83a512957c0253e871e9228adf6c0ea9b12c0719bac9544b8bf3ce9dc88c419"} Sep 29 19:00:57 crc kubenswrapper[4780]: I0929 19:00:57.931549 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063397 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063483 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjqvm\" (UniqueName: \"kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063518 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063527 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063676 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063700 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063763 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063787 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run\") pod \"a03266c1-544f-4abb-a2f0-88e077d77e0f\" (UID: \"a03266c1-544f-4abb-a2f0-88e077d77e0f\") " Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.063872 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run" (OuterVolumeSpecName: "var-run") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064530 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064659 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts" (OuterVolumeSpecName: "scripts") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064821 4780 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064839 4780 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064850 4780 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064860 4780 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a03266c1-544f-4abb-a2f0-88e077d77e0f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.064869 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a03266c1-544f-4abb-a2f0-88e077d77e0f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.071216 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm" (OuterVolumeSpecName: "kube-api-access-kjqvm") pod "a03266c1-544f-4abb-a2f0-88e077d77e0f" (UID: "a03266c1-544f-4abb-a2f0-88e077d77e0f"). InnerVolumeSpecName "kube-api-access-kjqvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.167015 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjqvm\" (UniqueName: \"kubernetes.io/projected/a03266c1-544f-4abb-a2f0-88e077d77e0f-kube-api-access-kjqvm\") on node \"crc\" DevicePath \"\"" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.442241 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.647231 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x-config-5pmh5" event={"ID":"a03266c1-544f-4abb-a2f0-88e077d77e0f","Type":"ContainerDied","Data":"3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32"} Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.647280 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3960b9e4a1015c264c7a9f8860064974c26080b0a1020a78fd93200bce301b32" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.647342 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x-config-5pmh5" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.724123 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ws6nx"] Sep 29 19:00:58 crc kubenswrapper[4780]: E0929 19:00:58.724716 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03266c1-544f-4abb-a2f0-88e077d77e0f" containerName="ovn-config" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.724774 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03266c1-544f-4abb-a2f0-88e077d77e0f" containerName="ovn-config" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.725013 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03266c1-544f-4abb-a2f0-88e077d77e0f" containerName="ovn-config" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.725789 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ws6nx" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.809761 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ws6nx"] Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.883954 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4dm25"] Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.898591 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbbk\" (UniqueName: \"kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk\") pod \"barbican-db-create-ws6nx\" (UID: \"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad\") " pod="openstack/barbican-db-create-ws6nx" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.902761 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4dm25" Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.905329 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4dm25"] Sep 29 19:00:58 crc kubenswrapper[4780]: I0929 19:00:58.926470 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.000723 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hbbk\" (UniqueName: \"kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk\") pod \"barbican-db-create-ws6nx\" (UID: \"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad\") " pod="openstack/barbican-db-create-ws6nx" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.000948 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8rl\" (UniqueName: \"kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl\") pod \"cinder-db-create-4dm25\" (UID: \"fc259cf2-313d-4bbf-add3-3df206332827\") " pod="openstack/cinder-db-create-4dm25" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.055878 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbbk\" (UniqueName: \"kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk\") pod \"barbican-db-create-ws6nx\" (UID: \"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad\") " pod="openstack/barbican-db-create-ws6nx" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.064026 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ws6nx" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.102445 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv8rl\" (UniqueName: \"kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl\") pod \"cinder-db-create-4dm25\" (UID: \"fc259cf2-313d-4bbf-add3-3df206332827\") " pod="openstack/cinder-db-create-4dm25" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.108315 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hzb5x-config-5pmh5"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.114683 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hzb5x-config-5pmh5"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.127938 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8rl\" (UniqueName: \"kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl\") pod \"cinder-db-create-4dm25\" (UID: \"fc259cf2-313d-4bbf-add3-3df206332827\") " pod="openstack/cinder-db-create-4dm25" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.143943 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rwgns"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.145431 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rwgns" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.156712 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rwgns"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.170763 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v9wvf"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.171971 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.174813 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.175197 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.175356 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.176135 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gg79" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.181300 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v9wvf"] Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.227129 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4dm25" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.308293 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.308924 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56j8\" (UniqueName: \"kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.309001 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk84\" (UniqueName: \"kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84\") pod \"neutron-db-create-rwgns\" (UID: \"f0c53ce3-391e-424f-ac39-6c7b49502aa2\") " pod="openstack/neutron-db-create-rwgns" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.309039 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.411409 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk84\" (UniqueName: \"kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84\") pod \"neutron-db-create-rwgns\" (UID: \"f0c53ce3-391e-424f-ac39-6c7b49502aa2\") " pod="openstack/neutron-db-create-rwgns" 
Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.411500 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.411561 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.411611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56j8\" (UniqueName: \"kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.423260 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.423380 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:00:59 crc kubenswrapper[4780]: I0929 19:00:59.431337 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56j8\" (UniqueName: \"kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8\") pod \"keystone-db-sync-v9wvf\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:00:59.436252 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fk84\" (UniqueName: \"kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84\") pod \"neutron-db-create-rwgns\" (UID: \"f0c53ce3-391e-424f-ac39-6c7b49502aa2\") " pod="openstack/neutron-db-create-rwgns" Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:00:59.513178 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rwgns" Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:00:59.528207 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:00:59.648417 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ws6nx"] Sep 29 19:01:00 crc kubenswrapper[4780]: W0929 19:00:59.657448 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc9cbc9_e264_4eba_81ff_38dbb8b126ad.slice/crio-ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482 WatchSource:0}: Error finding container ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482: Status 404 returned error can't find the container with id ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482 Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.322249 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4dm25"] Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.558212 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v9wvf"] Sep 29 19:01:00 crc kubenswrapper[4780]: W0929 19:01:00.560367 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c94ee7f_d255_4413_817d_77759b7d0e80.slice/crio-2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4 WatchSource:0}: Error finding container 2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4: Status 404 returned error can't find the container with id 2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4 Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.583588 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rwgns"] Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.668386 4780 generic.go:334] "Generic (PLEG): container finished" podID="fc259cf2-313d-4bbf-add3-3df206332827" containerID="13e8ab18182942ec7c1e17fb68cc0e8c23dbfb081acf5053cae6c70e5e243ff9" exitCode=0 Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.668549 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4dm25" event={"ID":"fc259cf2-313d-4bbf-add3-3df206332827","Type":"ContainerDied","Data":"13e8ab18182942ec7c1e17fb68cc0e8c23dbfb081acf5053cae6c70e5e243ff9"} Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.668631 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4dm25" event={"ID":"fc259cf2-313d-4bbf-add3-3df206332827","Type":"ContainerStarted","Data":"e0e52f967f6018b77d1bfe9dcc5ec044ab8de7266770c3b8788ab6270c45df19"} Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.670477 4780 generic.go:334] "Generic (PLEG): container finished" podID="1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" containerID="dd976228546b3979a349b28fd87ae9da5e203094d0bc0353d3f2409a6ca2e748" exitCode=0 Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.670554 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ws6nx" event={"ID":"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad","Type":"ContainerDied","Data":"dd976228546b3979a349b28fd87ae9da5e203094d0bc0353d3f2409a6ca2e748"} Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.670619 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ws6nx" event={"ID":"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad","Type":"ContainerStarted","Data":"ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482"} Sep 29 19:01:00 crc 
kubenswrapper[4780]: I0929 19:01:00.671657 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9wvf" event={"ID":"6c94ee7f-d255-4413-817d-77759b7d0e80","Type":"ContainerStarted","Data":"2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4"} Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.672805 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rwgns" event={"ID":"f0c53ce3-391e-424f-ac39-6c7b49502aa2","Type":"ContainerStarted","Data":"97632d790bab1bfcd90941c54ed10463c1af1892c9b0de9ea096ba7d703258b4"} Sep 29 19:01:00 crc kubenswrapper[4780]: I0929 19:01:00.783846 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03266c1-544f-4abb-a2f0-88e077d77e0f" path="/var/lib/kubelet/pods/a03266c1-544f-4abb-a2f0-88e077d77e0f/volumes" Sep 29 19:01:01 crc kubenswrapper[4780]: I0929 19:01:01.686925 4780 generic.go:334] "Generic (PLEG): container finished" podID="f0c53ce3-391e-424f-ac39-6c7b49502aa2" containerID="d5e0636251e3ed27e8f1f18c1f0cea8ea6425de2eb38557afac065bde4e57202" exitCode=0 Sep 29 19:01:01 crc kubenswrapper[4780]: I0929 19:01:01.687037 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rwgns" event={"ID":"f0c53ce3-391e-424f-ac39-6c7b49502aa2","Type":"ContainerDied","Data":"d5e0636251e3ed27e8f1f18c1f0cea8ea6425de2eb38557afac065bde4e57202"} Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.024600 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4dm25" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.114001 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ws6nx" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.178152 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv8rl\" (UniqueName: \"kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl\") pod \"fc259cf2-313d-4bbf-add3-3df206332827\" (UID: \"fc259cf2-313d-4bbf-add3-3df206332827\") " Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.184855 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl" (OuterVolumeSpecName: "kube-api-access-rv8rl") pod "fc259cf2-313d-4bbf-add3-3df206332827" (UID: "fc259cf2-313d-4bbf-add3-3df206332827"). InnerVolumeSpecName "kube-api-access-rv8rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.279597 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hbbk\" (UniqueName: \"kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk\") pod \"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad\" (UID: \"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad\") " Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.280318 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv8rl\" (UniqueName: \"kubernetes.io/projected/fc259cf2-313d-4bbf-add3-3df206332827-kube-api-access-rv8rl\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.283396 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk" (OuterVolumeSpecName: "kube-api-access-5hbbk") pod "1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" (UID: "1fc9cbc9-e264-4eba-81ff-38dbb8b126ad"). InnerVolumeSpecName "kube-api-access-5hbbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.382357 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hbbk\" (UniqueName: \"kubernetes.io/projected/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad-kube-api-access-5hbbk\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.698118 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4dm25" event={"ID":"fc259cf2-313d-4bbf-add3-3df206332827","Type":"ContainerDied","Data":"e0e52f967f6018b77d1bfe9dcc5ec044ab8de7266770c3b8788ab6270c45df19"} Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.698137 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4dm25" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.698152 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e52f967f6018b77d1bfe9dcc5ec044ab8de7266770c3b8788ab6270c45df19" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.699708 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ws6nx" event={"ID":"1fc9cbc9-e264-4eba-81ff-38dbb8b126ad","Type":"ContainerDied","Data":"ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482"} Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.699728 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee64f016f8791a29e26d7cf0010686e33c0bcd68d7a63ee99fba6629dd845482" Sep 29 19:01:02 crc kubenswrapper[4780]: I0929 19:01:02.699780 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ws6nx" Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.101417 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.158446 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.158694 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="dnsmasq-dns" containerID="cri-o://8bd8acd4279307c838824b77e09b9d3989c03328fe506d375edf06758448dbaf" gracePeriod=10 Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.222964 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.223364 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.712940 4780 generic.go:334] "Generic (PLEG): container finished" podID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerID="8bd8acd4279307c838824b77e09b9d3989c03328fe506d375edf06758448dbaf" exitCode=0 Sep 29 19:01:03 crc kubenswrapper[4780]: I0929 19:01:03.713165 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" event={"ID":"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd","Type":"ContainerDied","Data":"8bd8acd4279307c838824b77e09b9d3989c03328fe506d375edf06758448dbaf"} Sep 29 19:01:04 crc kubenswrapper[4780]: I0929 19:01:04.748793 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.719008 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rwgns" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.729696 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.746058 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" event={"ID":"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd","Type":"ContainerDied","Data":"128bce67e17fcc25dd9f5f656be2e30b846f36c46554917117fdc55656ced194"} Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.746099 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c6d5d5bd7-xxb52" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.746128 4780 scope.go:117] "RemoveContainer" containerID="8bd8acd4279307c838824b77e09b9d3989c03328fe506d375edf06758448dbaf" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.747846 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rwgns" event={"ID":"f0c53ce3-391e-424f-ac39-6c7b49502aa2","Type":"ContainerDied","Data":"97632d790bab1bfcd90941c54ed10463c1af1892c9b0de9ea096ba7d703258b4"} Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.747861 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97632d790bab1bfcd90941c54ed10463c1af1892c9b0de9ea096ba7d703258b4" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.747906 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rwgns" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.788989 4780 scope.go:117] "RemoveContainer" containerID="b5e622091d7a9d3aaae2f9c25bf82a0e2605f1d5787caddf95c59d81115f1ece" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.838807 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fk84\" (UniqueName: \"kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84\") pod \"f0c53ce3-391e-424f-ac39-6c7b49502aa2\" (UID: \"f0c53ce3-391e-424f-ac39-6c7b49502aa2\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.838921 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config\") pod \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.838969 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phm52\" (UniqueName: \"kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52\") pod \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.838987 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb\") pod \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.839104 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc\") pod \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.839160 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb\") pod \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\" (UID: \"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd\") " Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.842976 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52" (OuterVolumeSpecName: "kube-api-access-phm52") pod 
"7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" (UID: "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd"). InnerVolumeSpecName "kube-api-access-phm52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.843196 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84" (OuterVolumeSpecName: "kube-api-access-5fk84") pod "f0c53ce3-391e-424f-ac39-6c7b49502aa2" (UID: "f0c53ce3-391e-424f-ac39-6c7b49502aa2"). InnerVolumeSpecName "kube-api-access-5fk84". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.906371 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" (UID: "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.907036 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config" (OuterVolumeSpecName: "config") pod "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" (UID: "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.907483 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" (UID: "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.910697 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" (UID: "7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942853 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942907 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phm52\" (UniqueName: \"kubernetes.io/projected/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-kube-api-access-phm52\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942921 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942930 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942940 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:05 crc kubenswrapper[4780]: I0929 19:01:05.942948 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fk84\" (UniqueName: \"kubernetes.io/projected/f0c53ce3-391e-424f-ac39-6c7b49502aa2-kube-api-access-5fk84\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:06 crc kubenswrapper[4780]: I0929 19:01:06.089154 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 19:01:06 crc kubenswrapper[4780]: I0929 19:01:06.097692 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c6d5d5bd7-xxb52"] Sep 29 19:01:06 crc kubenswrapper[4780]: I0929 19:01:06.768635 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" path="/var/lib/kubelet/pods/7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd/volumes" Sep 29 19:01:06 crc kubenswrapper[4780]: I0929 19:01:06.769742 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9wvf" event={"ID":"6c94ee7f-d255-4413-817d-77759b7d0e80","Type":"ContainerStarted","Data":"a88ecd9869527db3e1e469d6ec77eea8b98e86fb8ec1d9f3b6c7e933897617ef"} Sep 29 19:01:06 crc kubenswrapper[4780]: I0929 19:01:06.794257 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v9wvf" podStartSLOduration=2.827636245 podStartE2EDuration="7.794228867s" podCreationTimestamp="2025-09-29 19:00:59 +0000 UTC" firstStartedPulling="2025-09-29 19:01:00.562776276 +0000 UTC m=+1060.511074320" lastFinishedPulling="2025-09-29 19:01:05.529368898 +0000 UTC m=+1065.477666942" observedRunningTime="2025-09-29 19:01:06.782808351 +0000 UTC m=+1066.731106425" watchObservedRunningTime="2025-09-29 19:01:06.794228867 +0000 UTC m=+1066.742526921" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.791881 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9wvf" event={"ID":"6c94ee7f-d255-4413-817d-77759b7d0e80","Type":"ContainerDied","Data":"a88ecd9869527db3e1e469d6ec77eea8b98e86fb8ec1d9f3b6c7e933897617ef"} Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.791750 4780 
generic.go:334] "Generic (PLEG): container finished" podID="6c94ee7f-d255-4413-817d-77759b7d0e80" containerID="a88ecd9869527db3e1e469d6ec77eea8b98e86fb8ec1d9f3b6c7e933897617ef" exitCode=0 Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798422 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-32a4-account-create-dsqns"] Sep 29 19:01:08 crc kubenswrapper[4780]: E0929 19:01:08.798796 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798809 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: E0929 19:01:08.798821 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c53ce3-391e-424f-ac39-6c7b49502aa2" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798827 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c53ce3-391e-424f-ac39-6c7b49502aa2" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: E0929 19:01:08.798849 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc259cf2-313d-4bbf-add3-3df206332827" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798856 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc259cf2-313d-4bbf-add3-3df206332827" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: E0929 19:01:08.798871 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="dnsmasq-dns" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798877 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="dnsmasq-dns" Sep 29 19:01:08 crc kubenswrapper[4780]: E0929 19:01:08.798885 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="init" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.798891 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="init" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.799033 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c53ce3-391e-424f-ac39-6c7b49502aa2" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.799076 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b83f0b0-d892-4336-a6ea-4dcbe1bf39dd" containerName="dnsmasq-dns" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.799087 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc259cf2-313d-4bbf-add3-3df206332827" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.799102 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" containerName="mariadb-database-create" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.799751 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.802301 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.806178 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-32a4-account-create-dsqns"] Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.897429 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57pnt\" (UniqueName: \"kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt\") pod \"barbican-32a4-account-create-dsqns\" (UID: \"def78cb3-faae-4256-9473-926ce387ca60\") " pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.977974 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a5b9-account-create-9sbx9"] Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.979404 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.982282 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 29 19:01:08 crc kubenswrapper[4780]: I0929 19:01:08.991611 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a5b9-account-create-9sbx9"] Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.002313 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57pnt\" (UniqueName: \"kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt\") pod \"barbican-32a4-account-create-dsqns\" (UID: \"def78cb3-faae-4256-9473-926ce387ca60\") " pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.023089 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57pnt\" (UniqueName: \"kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt\") pod \"barbican-32a4-account-create-dsqns\" (UID: \"def78cb3-faae-4256-9473-926ce387ca60\") " pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.103778 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw84k\" (UniqueName: \"kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k\") pod \"cinder-a5b9-account-create-9sbx9\" (UID: \"09d5b4b9-e63b-464f-8d39-1fea44ce658c\") " pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.118342 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.205268 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw84k\" (UniqueName: \"kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k\") pod \"cinder-a5b9-account-create-9sbx9\" (UID: \"09d5b4b9-e63b-464f-8d39-1fea44ce658c\") " pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.227823 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw84k\" (UniqueName: \"kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k\") pod \"cinder-a5b9-account-create-9sbx9\" (UID: \"09d5b4b9-e63b-464f-8d39-1fea44ce658c\") " pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.295591 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.564760 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-32a4-account-create-dsqns"] Sep 29 19:01:09 crc kubenswrapper[4780]: W0929 19:01:09.568697 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddef78cb3_faae_4256_9473_926ce387ca60.slice/crio-ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd WatchSource:0}: Error finding container ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd: Status 404 returned error can't find the container with id ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.726231 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a5b9-account-create-9sbx9"] Sep 29 19:01:09 crc kubenswrapper[4780]: W0929 19:01:09.733026 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09d5b4b9_e63b_464f_8d39_1fea44ce658c.slice/crio-e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310 WatchSource:0}: Error finding container e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310: Status 404 returned error can't find the container with id e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310 Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.804171 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvtnb" event={"ID":"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1","Type":"ContainerStarted","Data":"926f9138263106966b8c488fd9a2f55e330d7666c71e67c054a006eecb80d715"} Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.806818 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a5b9-account-create-9sbx9" event={"ID":"09d5b4b9-e63b-464f-8d39-1fea44ce658c","Type":"ContainerStarted","Data":"e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310"} Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.814262 4780 generic.go:334] "Generic (PLEG): container finished" podID="def78cb3-faae-4256-9473-926ce387ca60" containerID="9fe7a9ed34fb1e01649f4cef920a664af9f8e56918c80aa9fd1b9405170031d8" exitCode=0 Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.814344 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-32a4-account-create-dsqns" 
event={"ID":"def78cb3-faae-4256-9473-926ce387ca60","Type":"ContainerDied","Data":"9fe7a9ed34fb1e01649f4cef920a664af9f8e56918c80aa9fd1b9405170031d8"} Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.814398 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-32a4-account-create-dsqns" event={"ID":"def78cb3-faae-4256-9473-926ce387ca60","Type":"ContainerStarted","Data":"ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd"} Sep 29 19:01:09 crc kubenswrapper[4780]: I0929 19:01:09.830867 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rvtnb" podStartSLOduration=2.41325734 podStartE2EDuration="32.830845733s" podCreationTimestamp="2025-09-29 19:00:37 +0000 UTC" firstStartedPulling="2025-09-29 19:00:38.820842167 +0000 UTC m=+1038.769140211" lastFinishedPulling="2025-09-29 19:01:09.23843056 +0000 UTC m=+1069.186728604" observedRunningTime="2025-09-29 19:01:09.820845929 +0000 UTC m=+1069.769144003" watchObservedRunningTime="2025-09-29 19:01:09.830845733 +0000 UTC m=+1069.779143777" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.084955 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.222991 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle\") pod \"6c94ee7f-d255-4413-817d-77759b7d0e80\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.223445 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56j8\" (UniqueName: \"kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8\") pod \"6c94ee7f-d255-4413-817d-77759b7d0e80\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.223587 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data\") pod \"6c94ee7f-d255-4413-817d-77759b7d0e80\" (UID: \"6c94ee7f-d255-4413-817d-77759b7d0e80\") " Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.232125 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8" (OuterVolumeSpecName: "kube-api-access-w56j8") pod "6c94ee7f-d255-4413-817d-77759b7d0e80" (UID: "6c94ee7f-d255-4413-817d-77759b7d0e80"). InnerVolumeSpecName "kube-api-access-w56j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.260485 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c94ee7f-d255-4413-817d-77759b7d0e80" (UID: "6c94ee7f-d255-4413-817d-77759b7d0e80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.290516 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data" (OuterVolumeSpecName: "config-data") pod "6c94ee7f-d255-4413-817d-77759b7d0e80" (UID: "6c94ee7f-d255-4413-817d-77759b7d0e80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.326459 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.326497 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56j8\" (UniqueName: \"kubernetes.io/projected/6c94ee7f-d255-4413-817d-77759b7d0e80-kube-api-access-w56j8\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.326516 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c94ee7f-d255-4413-817d-77759b7d0e80-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.848305 4780 generic.go:334] "Generic (PLEG): container finished" podID="09d5b4b9-e63b-464f-8d39-1fea44ce658c" containerID="e9fbd9141d0e45dd131cdf8af3a6a466f887381720e1ef67a10b7d738ced7a6d" exitCode=0 Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.848490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a5b9-account-create-9sbx9" event={"ID":"09d5b4b9-e63b-464f-8d39-1fea44ce658c","Type":"ContainerDied","Data":"e9fbd9141d0e45dd131cdf8af3a6a466f887381720e1ef67a10b7d738ced7a6d"} Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.853195 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v9wvf" event={"ID":"6c94ee7f-d255-4413-817d-77759b7d0e80","Type":"ContainerDied","Data":"2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4"} Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.853248 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a563c84e526f7c9eff22c5c63e35e07fda3e49c8251761ab3abdc9d9eef0fe4" Sep 29 19:01:10 crc kubenswrapper[4780]: I0929 19:01:10.853289 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v9wvf" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.061850 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:11 crc kubenswrapper[4780]: E0929 19:01:11.062532 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c94ee7f-d255-4413-817d-77759b7d0e80" containerName="keystone-db-sync" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.062556 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c94ee7f-d255-4413-817d-77759b7d0e80" containerName="keystone-db-sync" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.062810 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c94ee7f-d255-4413-817d-77759b7d0e80" containerName="keystone-db-sync" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.063980 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.091341 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.142470 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k2fbg"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.143866 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.143921 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.143971 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.144077 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.144112 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.144136 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5tlf\" (UniqueName: \"kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.144324 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.147987 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.148254 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.148422 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.148542 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gg79" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.157917 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2fbg"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.175225 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.246191 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.246531 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.246657 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.246779 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.246908 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247000 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9fx\" (UniqueName: \"kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247147 4780 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247249 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5tlf\" (UniqueName: \"kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247534 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.247631 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.250231 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.248733 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.249091 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.249833 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.248402 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.253070 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.289075 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5tlf\" (UniqueName: \"kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf\") pod \"dnsmasq-dns-7f4777664c-7c7mp\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.303233 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:01:11 crc kubenswrapper[4780]: E0929 19:01:11.303618 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def78cb3-faae-4256-9473-926ce387ca60" containerName="mariadb-account-create" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.303634 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="def78cb3-faae-4256-9473-926ce387ca60" containerName="mariadb-account-create" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.303834 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="def78cb3-faae-4256-9473-926ce387ca60" containerName="mariadb-account-create" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.309763 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.312372 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.312649 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.321201 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.353966 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57pnt\" (UniqueName: \"kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt\") pod \"def78cb3-faae-4256-9473-926ce387ca60\" (UID: \"def78cb3-faae-4256-9473-926ce387ca60\") " Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.354605 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.354731 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9fx\" (UniqueName: \"kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.354856 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.354935 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.355011 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.355140 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.359884 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt" (OuterVolumeSpecName: "kube-api-access-57pnt") pod "def78cb3-faae-4256-9473-926ce387ca60" (UID: "def78cb3-faae-4256-9473-926ce387ca60"). 
InnerVolumeSpecName "kube-api-access-57pnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.362748 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.369584 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.377668 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.378566 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.379004 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.391079 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9fx\" (UniqueName: \"kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx\") pod \"keystone-bootstrap-k2fbg\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.462917 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463023 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463069 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkpt\" (UniqueName: \"kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463280 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463377 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463431 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.463509 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.464473 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57pnt\" (UniqueName: \"kubernetes.io/projected/def78cb3-faae-4256-9473-926ce387ca60-kube-api-access-57pnt\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.470482 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xmmfb"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.472284 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.481665 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.486068 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.487118 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tmknm" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.487142 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.494472 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xmmfb"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.497228 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.524132 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.546270 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.552534 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568689 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568751 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568772 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568817 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568871 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568913 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.568931 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkpt\" (UniqueName: \"kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.569583 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.575397 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.578149 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " 
pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.579904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.580160 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.590452 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.591600 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.598803 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkpt\" (UniqueName: \"kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt\") pod \"ceilometer-0\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") " pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.649549 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671146 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671252 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671296 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sw9l\" (UniqueName: \"kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671347 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671398 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671433 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.671490 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.677985 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwfn2\" (UniqueName: \"kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.678100 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.678151 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.678201 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779701 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sw9l\" (UniqueName: \"kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779751 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779783 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779810 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779853 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779902 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwfn2\" (UniqueName: \"kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779922 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779945 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.779995 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.780042 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.781011 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " 
pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.781079 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.781196 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.784567 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.785365 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.785645 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.787509 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.795667 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.803461 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.803695 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sw9l\" (UniqueName: \"kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l\") pod \"dnsmasq-dns-b4dc449d9-jfqfw\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") " pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.807675 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rwfn2\" (UniqueName: \"kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2\") pod \"placement-db-sync-xmmfb\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.833242 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.856173 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2fbg"] Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.890180 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-32a4-account-create-dsqns" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.890415 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-32a4-account-create-dsqns" event={"ID":"def78cb3-faae-4256-9473-926ce387ca60","Type":"ContainerDied","Data":"ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd"} Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.890454 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad935ee7f555aa63186a4e9c33111ee060f47df895e0270a22a7631485a7dbfd" Sep 29 19:01:11 crc kubenswrapper[4780]: I0929 19:01:11.890753 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:11 crc kubenswrapper[4780]: W0929 19:01:11.892036 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb534a68_12be_4fcf_9e93_de38766353d0.slice/crio-a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f WatchSource:0}: Error finding container a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f: Status 404 returned error can't find the container with id a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.166107 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:12 crc kubenswrapper[4780]: W0929 19:01:12.175739 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b9814e_9fa0_4263_ae85_755a416bb9e3.slice/crio-7a12b0af4164e8f883a6f6de4c2a34997c8c5456b77897a38ae9d93ecd255d09 WatchSource:0}: Error finding container 7a12b0af4164e8f883a6f6de4c2a34997c8c5456b77897a38ae9d93ecd255d09: Status 404 returned error can't find the container with id 7a12b0af4164e8f883a6f6de4c2a34997c8c5456b77897a38ae9d93ecd255d09 Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.269028 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.376282 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.379436 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"] Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.494055 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xmmfb"] Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.495496 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw84k\" (UniqueName: \"kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k\") pod \"09d5b4b9-e63b-464f-8d39-1fea44ce658c\" (UID: \"09d5b4b9-e63b-464f-8d39-1fea44ce658c\") " Sep 29 19:01:12 crc kubenswrapper[4780]: W0929 19:01:12.499459 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89293f3_5080_4326_9f2c_7ba9a2f34280.slice/crio-f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3 WatchSource:0}: Error finding container f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3: Status 404 returned error can't find the container with id f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3 Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.504310 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k" (OuterVolumeSpecName: "kube-api-access-xw84k") pod "09d5b4b9-e63b-464f-8d39-1fea44ce658c" (UID: "09d5b4b9-e63b-464f-8d39-1fea44ce658c"). InnerVolumeSpecName "kube-api-access-xw84k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.598670 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw84k\" (UniqueName: \"kubernetes.io/projected/09d5b4b9-e63b-464f-8d39-1fea44ce658c-kube-api-access-xw84k\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.901549 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2fbg" event={"ID":"fb534a68-12be-4fcf-9e93-de38766353d0","Type":"ContainerStarted","Data":"95bb319bec1c179b889b9f0a8e5359054dc1c3b89c4e646e5d48ce7444e5d055"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.901632 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2fbg" event={"ID":"fb534a68-12be-4fcf-9e93-de38766353d0","Type":"ContainerStarted","Data":"a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.903392 4780 generic.go:334] "Generic (PLEG): container finished" podID="781fb447-43b6-4035-843f-3d51675807bc" containerID="96b76cb996959adad03e821840a3207fa0455cc4ed38fa699e979b42dee2978e" exitCode=0 Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.903452 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" event={"ID":"781fb447-43b6-4035-843f-3d51675807bc","Type":"ContainerDied","Data":"96b76cb996959adad03e821840a3207fa0455cc4ed38fa699e979b42dee2978e"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.903473 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" event={"ID":"781fb447-43b6-4035-843f-3d51675807bc","Type":"ContainerStarted","Data":"f106e14988143c193621934e073e382cfe4f7aef3f65a6028d3e040aa77176e0"} 
Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.905088 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerStarted","Data":"21bb7ad014be7240b97cf79e0c08c9685ee395c6724cb70aa3fa08ff483f2280"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.907287 4780 generic.go:334] "Generic (PLEG): container finished" podID="55b9814e-9fa0-4263-ae85-755a416bb9e3" containerID="98a32bbafb2f45fc948c7d47dbea0ebdafa1adad35ab61bf6a3333439d4dabfa" exitCode=0 Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.907333 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" event={"ID":"55b9814e-9fa0-4263-ae85-755a416bb9e3","Type":"ContainerDied","Data":"98a32bbafb2f45fc948c7d47dbea0ebdafa1adad35ab61bf6a3333439d4dabfa"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.907372 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" event={"ID":"55b9814e-9fa0-4263-ae85-755a416bb9e3","Type":"ContainerStarted","Data":"7a12b0af4164e8f883a6f6de4c2a34997c8c5456b77897a38ae9d93ecd255d09"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.909547 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a5b9-account-create-9sbx9" Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.909594 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a5b9-account-create-9sbx9" event={"ID":"09d5b4b9-e63b-464f-8d39-1fea44ce658c","Type":"ContainerDied","Data":"e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.909627 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a5fb003e2406d12ac8a4a47892922c95beb116396ce35ddf8264286731a310" Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.910636 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xmmfb" event={"ID":"f89293f3-5080-4326-9f2c-7ba9a2f34280","Type":"ContainerStarted","Data":"f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3"} Sep 29 19:01:12 crc kubenswrapper[4780]: I0929 19:01:12.922649 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k2fbg" podStartSLOduration=1.9225902000000001 podStartE2EDuration="1.9225902s" podCreationTimestamp="2025-09-29 19:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:12.918809999 +0000 UTC m=+1072.867108043" watchObservedRunningTime="2025-09-29 19:01:12.9225902 +0000 UTC m=+1072.870888244" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.268901 4780 util.go:48] "No ready sandbox for pod can be found. 
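The SyncLoop (PLEG) records in this stretch report container lifecycle transitions (ContainerStarted, ContainerDied, with generic.go:334 carrying the exit code). The event={...} payload on these lines happens to be valid JSON, so it can be pulled out directly. A minimal sketch, assuming the journal text sits in a file named kubelet.log (hypothetical):

import json
import re

# pod="..." immediately precedes the event payload on these records;
# the payload has no nested braces, so a non-greedy match suffices.
EVT = re.compile(r'pod="([^"]+)" event=(\{.*?\})')

with open("kubelet.log") as fh:
    for line in fh:
        m = EVT.search(line)
        if not m:
            continue
        pod, evt = m.group(1), json.loads(m.group(2))
        print(f'{pod}: {evt["Type"]} {evt["Data"][:12]}')

On the records above this would print, e.g., openstack/dnsmasq-dns-b4dc449d9-jfqfw: ContainerDied 96b76cb99695 for the init container, followed by the ContainerStarted events.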
Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.418982 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5tlf\" (UniqueName: \"kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.419098 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.419130 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.419522 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.419564 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.419608 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc\") pod \"55b9814e-9fa0-4263-ae85-755a416bb9e3\" (UID: \"55b9814e-9fa0-4263-ae85-755a416bb9e3\") " Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.430503 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf" (OuterVolumeSpecName: "kube-api-access-p5tlf") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "kube-api-access-p5tlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.446700 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.446784 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config" (OuterVolumeSpecName: "config") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.449716 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.451758 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.459249 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55b9814e-9fa0-4263-ae85-755a416bb9e3" (UID: "55b9814e-9fa0-4263-ae85-755a416bb9e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522014 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5tlf\" (UniqueName: \"kubernetes.io/projected/55b9814e-9fa0-4263-ae85-755a416bb9e3-kube-api-access-p5tlf\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522067 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522086 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522098 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522110 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.522123 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b9814e-9fa0-4263-ae85-755a416bb9e3-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.933240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" event={"ID":"55b9814e-9fa0-4263-ae85-755a416bb9e3","Type":"ContainerDied","Data":"7a12b0af4164e8f883a6f6de4c2a34997c8c5456b77897a38ae9d93ecd255d09"} Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.933328 4780 scope.go:117] "RemoveContainer" containerID="98a32bbafb2f45fc948c7d47dbea0ebdafa1adad35ab61bf6a3333439d4dabfa" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.933376 
4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f4777664c-7c7mp" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.937835 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" event={"ID":"781fb447-43b6-4035-843f-3d51675807bc","Type":"ContainerStarted","Data":"40b87188a879ca9d86e1be70e598d1e7bd09d2931848a16c8a5eaf1d9217f3f2"} Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.938072 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:13 crc kubenswrapper[4780]: I0929 19:01:13.977100 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" podStartSLOduration=2.977061864 podStartE2EDuration="2.977061864s" podCreationTimestamp="2025-09-29 19:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:13.966490294 +0000 UTC m=+1073.914788348" watchObservedRunningTime="2025-09-29 19:01:13.977061864 +0000 UTC m=+1073.925359918" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.033241 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.053090 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f4777664c-7c7mp"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.286795 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.324566 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f6rkc"] Sep 29 19:01:14 crc kubenswrapper[4780]: E0929 19:01:14.324975 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d5b4b9-e63b-464f-8d39-1fea44ce658c" containerName="mariadb-account-create" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.324993 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d5b4b9-e63b-464f-8d39-1fea44ce658c" containerName="mariadb-account-create" Sep 29 19:01:14 crc kubenswrapper[4780]: E0929 19:01:14.325041 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b9814e-9fa0-4263-ae85-755a416bb9e3" containerName="init" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.325175 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b9814e-9fa0-4263-ae85-755a416bb9e3" containerName="init" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.325342 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b9814e-9fa0-4263-ae85-755a416bb9e3" containerName="init" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.325360 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d5b4b9-e63b-464f-8d39-1fea44ce658c" containerName="mariadb-account-create" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.326194 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.332295 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44p45" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.332827 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.345172 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f6rkc"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.348627 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.349374 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vrj\" (UniqueName: \"kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.349442 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.433357 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rhnjt"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.434976 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450285 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450328 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450351 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450378 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450399 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450419 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450455 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t826s\" (UniqueName: \"kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450487 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.450524 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vrj\" (UniqueName: \"kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " 
pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.455692 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.455918 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.456021 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nmjgd" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.474515 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.477614 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vrj\" (UniqueName: \"kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.477677 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rhnjt"] Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.492745 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle\") pod \"barbican-db-sync-f6rkc\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") " pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.553640 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t826s\" (UniqueName: \"kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.553862 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.553971 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.554088 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.554183 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.554254 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.561689 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.562527 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.562716 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.571899 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.583500 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.602677 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t826s\" (UniqueName: \"kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s\") pod \"cinder-db-sync-rhnjt\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") " pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.645297 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rhnjt" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.652395 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f6rkc" Sep 29 19:01:14 crc kubenswrapper[4780]: I0929 19:01:14.776718 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b9814e-9fa0-4263-ae85-755a416bb9e3" path="/var/lib/kubelet/pods/55b9814e-9fa0-4263-ae85-755a416bb9e3/volumes" Sep 29 19:01:15 crc kubenswrapper[4780]: I0929 19:01:15.181313 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rhnjt"] Sep 29 19:01:15 crc kubenswrapper[4780]: I0929 19:01:15.287192 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f6rkc"] Sep 29 19:01:15 crc kubenswrapper[4780]: I0929 19:01:15.979029 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rhnjt" event={"ID":"3723c568-a926-469d-bda8-99c2a0ed7095","Type":"ContainerStarted","Data":"d11bad14f678cb07017d151b28f0cc3f55a6e91a28db9fed10eb65a60cfbaa40"} Sep 29 19:01:15 crc kubenswrapper[4780]: I0929 19:01:15.981611 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6rkc" event={"ID":"f4fd579e-b8e6-4845-b5fd-b9291fe94829","Type":"ContainerStarted","Data":"66f4dd8a95fc2a9cfe44f2eff6463d233c364fccd0f1895ad5a0d18ca96cfff5"} Sep 29 19:01:18 crc kubenswrapper[4780]: I0929 19:01:18.003054 4780 generic.go:334] "Generic (PLEG): container finished" podID="fb534a68-12be-4fcf-9e93-de38766353d0" containerID="95bb319bec1c179b889b9f0a8e5359054dc1c3b89c4e646e5d48ce7444e5d055" exitCode=0 Sep 29 19:01:18 crc kubenswrapper[4780]: I0929 19:01:18.003106 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2fbg" event={"ID":"fb534a68-12be-4fcf-9e93-de38766353d0","Type":"ContainerDied","Data":"95bb319bec1c179b889b9f0a8e5359054dc1c3b89c4e646e5d48ce7444e5d055"} Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.110645 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-33e2-account-create-zvxcq"] Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.112242 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.115397 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.119303 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-33e2-account-create-zvxcq"] Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.246529 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwt4\" (UniqueName: \"kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4\") pod \"neutron-33e2-account-create-zvxcq\" (UID: \"8c1ed602-275e-4595-a3fe-171555e9b681\") " pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.348201 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwt4\" (UniqueName: \"kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4\") pod \"neutron-33e2-account-create-zvxcq\" (UID: \"8c1ed602-275e-4595-a3fe-171555e9b681\") " pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.382104 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwt4\" (UniqueName: \"kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4\") pod \"neutron-33e2-account-create-zvxcq\" (UID: \"8c1ed602-275e-4595-a3fe-171555e9b681\") " pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.431945 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.865832 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.957834 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.957915 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.957958 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.958021 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9fx\" (UniqueName: \"kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.958076 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.958263 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle\") pod \"fb534a68-12be-4fcf-9e93-de38766353d0\" (UID: \"fb534a68-12be-4fcf-9e93-de38766353d0\") " Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.964329 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.964714 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.964773 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx" (OuterVolumeSpecName: "kube-api-access-8r9fx") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "kube-api-access-8r9fx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.975462 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts" (OuterVolumeSpecName: "scripts") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.988688 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data" (OuterVolumeSpecName: "config-data") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:19 crc kubenswrapper[4780]: I0929 19:01:19.990189 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb534a68-12be-4fcf-9e93-de38766353d0" (UID: "fb534a68-12be-4fcf-9e93-de38766353d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.037921 4780 generic.go:334] "Generic (PLEG): container finished" podID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" containerID="926f9138263106966b8c488fd9a2f55e330d7666c71e67c054a006eecb80d715" exitCode=0 Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.038260 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvtnb" event={"ID":"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1","Type":"ContainerDied","Data":"926f9138263106966b8c488fd9a2f55e330d7666c71e67c054a006eecb80d715"} Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.043821 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k2fbg" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.043848 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2fbg" event={"ID":"fb534a68-12be-4fcf-9e93-de38766353d0","Type":"ContainerDied","Data":"a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f"} Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.043900 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8dde4c1aee9ce87e4a302da1217db37badd17a0158c1eac03c4b6bb925c806f" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060113 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r9fx\" (UniqueName: \"kubernetes.io/projected/fb534a68-12be-4fcf-9e93-de38766353d0-kube-api-access-8r9fx\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060152 4780 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060165 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060177 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060189 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.060201 4780 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb534a68-12be-4fcf-9e93-de38766353d0-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.095679 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k2fbg"] Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.102215 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k2fbg"] Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.194145 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-czsds"] Sep 29 19:01:20 crc kubenswrapper[4780]: E0929 19:01:20.194653 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb534a68-12be-4fcf-9e93-de38766353d0" containerName="keystone-bootstrap" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.194672 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb534a68-12be-4fcf-9e93-de38766353d0" containerName="keystone-bootstrap" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.194998 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb534a68-12be-4fcf-9e93-de38766353d0" containerName="keystone-bootstrap" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.195828 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.198141 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.198589 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.198658 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gg79" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.198688 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.217461 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czsds"] Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268229 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzrm\" (UniqueName: \"kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268306 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268382 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268441 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268564 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.268640 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369790 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzrm\" (UniqueName: \"kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm\") pod 
\"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369843 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369894 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369919 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.369997 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.373723 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.375120 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.375125 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.376273 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.379577 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.389869 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzrm\" (UniqueName: \"kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm\") pod \"keystone-bootstrap-czsds\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.518290 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.811609 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb534a68-12be-4fcf-9e93-de38766353d0" path="/var/lib/kubelet/pods/fb534a68-12be-4fcf-9e93-de38766353d0/volumes" Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.889190 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-33e2-account-create-zvxcq"] Sep 29 19:01:20 crc kubenswrapper[4780]: I0929 19:01:20.930721 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.072039 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xmmfb" event={"ID":"f89293f3-5080-4326-9f2c-7ba9a2f34280","Type":"ContainerStarted","Data":"bdc5bfe5a9b796768563b3c0a298999c5127cb243c728936fc569fce10667de8"} Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.074901 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33e2-account-create-zvxcq" event={"ID":"8c1ed602-275e-4595-a3fe-171555e9b681","Type":"ContainerStarted","Data":"793c9cc5f6a81b19b49a98206e4c0e94c72b14d7055ea2c2805ac43a560679b6"} Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.077275 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerStarted","Data":"719bf0b0631d9e71a1709977874890c41e75b7a3ac1293fbcd3b57f90768e4a0"} Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.099325 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xmmfb" podStartSLOduration=2.239256268 podStartE2EDuration="10.099298131s" podCreationTimestamp="2025-09-29 19:01:11 +0000 UTC" firstStartedPulling="2025-09-29 19:01:12.502635516 +0000 UTC m=+1072.450933570" lastFinishedPulling="2025-09-29 19:01:20.362677389 +0000 UTC m=+1080.310975433" observedRunningTime="2025-09-29 19:01:21.094753347 +0000 UTC m=+1081.043051391" watchObservedRunningTime="2025-09-29 19:01:21.099298131 +0000 UTC m=+1081.047596175" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.124025 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czsds"] Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.609670 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvtnb" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.721875 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data\") pod \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.722565 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5r4w\" (UniqueName: \"kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w\") pod \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.722665 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle\") pod \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.724419 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data\") pod \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\" (UID: \"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1\") " Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.733523 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" (UID: "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.742096 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w" (OuterVolumeSpecName: "kube-api-access-x5r4w") pod "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" (UID: "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1"). InnerVolumeSpecName "kube-api-access-x5r4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.784838 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data" (OuterVolumeSpecName: "config-data") pod "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" (UID: "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.800085 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" (UID: "91d3cfe6-96f0-442a-aa5d-8a08ff10eed1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.828917 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.828944 4780 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.828958 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5r4w\" (UniqueName: \"kubernetes.io/projected/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-kube-api-access-x5r4w\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.828968 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:21 crc kubenswrapper[4780]: I0929 19:01:21.893884 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.038612 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.038918 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="dnsmasq-dns" containerID="cri-o://4a56d8c6ff3bd869bd7453e6297a3217d7fae5390ca20a21c7949b36f190b2fb" gracePeriod=10 Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.106482 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czsds" event={"ID":"3f614b85-1709-4020-87c7-c349da7de2c8","Type":"ContainerStarted","Data":"a12d0485b5530686d7a1231632f902193e840b516f0e2c1682e9b5b329c06003"} Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.106545 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czsds" event={"ID":"3f614b85-1709-4020-87c7-c349da7de2c8","Type":"ContainerStarted","Data":"44cf8ae7ae74b24de23fdfb582fe567c3b206ee9b0bafcc788ae5a23e3b23d8e"} Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.116239 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvtnb" event={"ID":"91d3cfe6-96f0-442a-aa5d-8a08ff10eed1","Type":"ContainerDied","Data":"1b359d29b1fdd3296b851acd03710b7b023663fcb7ebca9157d905b5a5173538"} Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.116298 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b359d29b1fdd3296b851acd03710b7b023663fcb7ebca9157d905b5a5173538" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.116392 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvtnb" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.119780 4780 generic.go:334] "Generic (PLEG): container finished" podID="8c1ed602-275e-4595-a3fe-171555e9b681" containerID="4aa1ef08d742f38c18d483e9a34bfc178c7346f4dc1f342627977887f315213f" exitCode=0 Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.119956 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33e2-account-create-zvxcq" event={"ID":"8c1ed602-275e-4595-a3fe-171555e9b681","Type":"ContainerDied","Data":"4aa1ef08d742f38c18d483e9a34bfc178c7346f4dc1f342627977887f315213f"} Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.139195 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-czsds" podStartSLOduration=2.139168246 podStartE2EDuration="2.139168246s" podCreationTimestamp="2025-09-29 19:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:22.132572032 +0000 UTC m=+1082.080870086" watchObservedRunningTime="2025-09-29 19:01:22.139168246 +0000 UTC m=+1082.087466290" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.496933 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:01:22 crc kubenswrapper[4780]: E0929 19:01:22.497636 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" containerName="glance-db-sync" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.497647 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" containerName="glance-db-sync" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.497818 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" containerName="glance-db-sync" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.498794 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.519647 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663034 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663136 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663301 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663320 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663410 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.663522 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drdw9\" (UniqueName: \"kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.764799 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdw9\" (UniqueName: \"kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.764912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.764952 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.765022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.765062 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.765112 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.766708 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.766862 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.766943 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.767137 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.767798 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.793329 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdw9\" (UniqueName: 
\"kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9\") pod \"dnsmasq-dns-5674f66f87-vrjks\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:22 crc kubenswrapper[4780]: I0929 19:01:22.840292 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.100890 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.147401 4780 generic.go:334] "Generic (PLEG): container finished" podID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerID="4a56d8c6ff3bd869bd7453e6297a3217d7fae5390ca20a21c7949b36f190b2fb" exitCode=0 Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.147516 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" event={"ID":"3b4750fb-fa69-4c76-b3c8-8c250e933533","Type":"ContainerDied","Data":"4a56d8c6ff3bd869bd7453e6297a3217d7fae5390ca20a21c7949b36f190b2fb"} Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.389344 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.413108 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.414656 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.417441 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.417914 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.418131 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-w474c" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.425777 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.593909 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.593978 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.594018 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4f7c\" (UniqueName: 
\"kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.594419 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.594460 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.594534 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.594645 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.683659 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697331 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697396 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697443 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697492 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697520 
4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697549 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.697577 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4f7c\" (UniqueName: \"kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.698414 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.700640 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.701190 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.701435 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.709608 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.731128 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.743177 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.756477 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.757378 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.784836 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4f7c\" (UniqueName: \"kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.882419 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " pod="openstack/glance-default-external-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926039 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926144 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926195 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926234 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc kubenswrapper[4780]: I0929 19:01:23.926270 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:23 crc 
kubenswrapper[4780]: I0929 19:01:23.926288 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltb4j\" (UniqueName: \"kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028111 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028192 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028236 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028256 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltb4j\" (UniqueName: \"kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028311 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028339 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028363 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028401 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.028949 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.029077 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.033967 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.039006 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.039273 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.044580 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.050619 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltb4j\" (UniqueName: \"kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.060177 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.160706 4780 generic.go:334] "Generic (PLEG): container finished" podID="f89293f3-5080-4326-9f2c-7ba9a2f34280" containerID="bdc5bfe5a9b796768563b3c0a298999c5127cb243c728936fc569fce10667de8" exitCode=0 Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.160759 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xmmfb" event={"ID":"f89293f3-5080-4326-9f2c-7ba9a2f34280","Type":"ContainerDied","Data":"bdc5bfe5a9b796768563b3c0a298999c5127cb243c728936fc569fce10667de8"} Sep 29 19:01:24 crc kubenswrapper[4780]: W0929 19:01:24.205531 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01004ec9_c3e3_4549_abbf_94af0692c0b1.slice/crio-825316ce0117aa509bed175269eecfb920d7ed775271bf5915fb6158a44764ef WatchSource:0}: Error finding container 825316ce0117aa509bed175269eecfb920d7ed775271bf5915fb6158a44764ef: Status 404 returned error can't find the container with id 825316ce0117aa509bed175269eecfb920d7ed775271bf5915fb6158a44764ef Sep 29 19:01:24 crc kubenswrapper[4780]: I0929 19:01:24.268671 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:25 crc kubenswrapper[4780]: I0929 19:01:25.193928 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" event={"ID":"01004ec9-c3e3-4549-abbf-94af0692c0b1","Type":"ContainerStarted","Data":"825316ce0117aa509bed175269eecfb920d7ed775271bf5915fb6158a44764ef"} Sep 29 19:01:25 crc kubenswrapper[4780]: I0929 19:01:25.583338 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:25 crc kubenswrapper[4780]: I0929 19:01:25.655071 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.209824 4780 generic.go:334] "Generic (PLEG): container finished" podID="3f614b85-1709-4020-87c7-c349da7de2c8" containerID="a12d0485b5530686d7a1231632f902193e840b516f0e2c1682e9b5b329c06003" exitCode=0 Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.210390 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czsds" event={"ID":"3f614b85-1709-4020-87c7-c349da7de2c8","Type":"ContainerDied","Data":"a12d0485b5530686d7a1231632f902193e840b516f0e2c1682e9b5b329c06003"} Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.802772 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.888564 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwfn2\" (UniqueName: \"kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2\") pod \"f89293f3-5080-4326-9f2c-7ba9a2f34280\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.888624 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle\") pod \"f89293f3-5080-4326-9f2c-7ba9a2f34280\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.888712 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs\") pod \"f89293f3-5080-4326-9f2c-7ba9a2f34280\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.888807 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts\") pod \"f89293f3-5080-4326-9f2c-7ba9a2f34280\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.888839 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data\") pod \"f89293f3-5080-4326-9f2c-7ba9a2f34280\" (UID: \"f89293f3-5080-4326-9f2c-7ba9a2f34280\") " Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.893287 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2" (OuterVolumeSpecName: "kube-api-access-rwfn2") pod "f89293f3-5080-4326-9f2c-7ba9a2f34280" (UID: "f89293f3-5080-4326-9f2c-7ba9a2f34280"). InnerVolumeSpecName "kube-api-access-rwfn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.893652 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs" (OuterVolumeSpecName: "logs") pod "f89293f3-5080-4326-9f2c-7ba9a2f34280" (UID: "f89293f3-5080-4326-9f2c-7ba9a2f34280"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.909196 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts" (OuterVolumeSpecName: "scripts") pod "f89293f3-5080-4326-9f2c-7ba9a2f34280" (UID: "f89293f3-5080-4326-9f2c-7ba9a2f34280"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.918166 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f89293f3-5080-4326-9f2c-7ba9a2f34280" (UID: "f89293f3-5080-4326-9f2c-7ba9a2f34280"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.937093 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data" (OuterVolumeSpecName: "config-data") pod "f89293f3-5080-4326-9f2c-7ba9a2f34280" (UID: "f89293f3-5080-4326-9f2c-7ba9a2f34280"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.991688 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89293f3-5080-4326-9f2c-7ba9a2f34280-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.991726 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.991737 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.991749 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwfn2\" (UniqueName: \"kubernetes.io/projected/f89293f3-5080-4326-9f2c-7ba9a2f34280-kube-api-access-rwfn2\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:26 crc kubenswrapper[4780]: I0929 19:01:26.991760 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89293f3-5080-4326-9f2c-7ba9a2f34280-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.220506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xmmfb" event={"ID":"f89293f3-5080-4326-9f2c-7ba9a2f34280","Type":"ContainerDied","Data":"f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3"} Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.220548 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xmmfb" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.220560 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f19019125390cc11389824dbcc0a6a152ac67d1b6695f83cdfb2757e004781b3" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.780616 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.784869 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.917867 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mr8m\" (UniqueName: \"kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.918035 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.918148 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.918743 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.923125 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwt4\" (UniqueName: \"kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4\") pod \"8c1ed602-275e-4595-a3fe-171555e9b681\" (UID: \"8c1ed602-275e-4595-a3fe-171555e9b681\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.923148 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.923161 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0\") pod \"3b4750fb-fa69-4c76-b3c8-8c250e933533\" (UID: \"3b4750fb-fa69-4c76-b3c8-8c250e933533\") " Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.927192 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:01:27 crc kubenswrapper[4780]: E0929 19:01:27.927713 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="init" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.927735 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="init" Sep 29 19:01:27 crc kubenswrapper[4780]: E0929 19:01:27.927750 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="dnsmasq-dns" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.927757 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="dnsmasq-dns" Sep 29 19:01:27 crc kubenswrapper[4780]: E0929 19:01:27.927813 4780 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89293f3-5080-4326-9f2c-7ba9a2f34280" containerName="placement-db-sync" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.927823 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89293f3-5080-4326-9f2c-7ba9a2f34280" containerName="placement-db-sync" Sep 29 19:01:27 crc kubenswrapper[4780]: E0929 19:01:27.927834 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1ed602-275e-4595-a3fe-171555e9b681" containerName="mariadb-account-create" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.927841 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1ed602-275e-4595-a3fe-171555e9b681" containerName="mariadb-account-create" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.928201 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" containerName="dnsmasq-dns" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.928224 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1ed602-275e-4595-a3fe-171555e9b681" containerName="mariadb-account-create" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.928234 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89293f3-5080-4326-9f2c-7ba9a2f34280" containerName="placement-db-sync" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.929719 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.934252 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.934671 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.935181 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.935327 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tmknm" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.935490 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.941718 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m" (OuterVolumeSpecName: "kube-api-access-5mr8m") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "kube-api-access-5mr8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.948022 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:01:27 crc kubenswrapper[4780]: I0929 19:01:27.954367 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4" (OuterVolumeSpecName: "kube-api-access-fnwt4") pod "8c1ed602-275e-4595-a3fe-171555e9b681" (UID: "8c1ed602-275e-4595-a3fe-171555e9b681"). InnerVolumeSpecName "kube-api-access-fnwt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.025560 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnwt4\" (UniqueName: \"kubernetes.io/projected/8c1ed602-275e-4595-a3fe-171555e9b681-kube-api-access-fnwt4\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.025587 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mr8m\" (UniqueName: \"kubernetes.io/projected/3b4750fb-fa69-4c76-b3c8-8c250e933533-kube-api-access-5mr8m\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.027973 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.034545 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config" (OuterVolumeSpecName: "config") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.037522 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.065304 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.094174 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b4750fb-fa69-4c76-b3c8-8c250e933533" (UID: "3b4750fb-fa69-4c76-b3c8-8c250e933533"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127384 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxl4c\" (UniqueName: \"kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127475 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127525 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127553 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127581 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127607 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127706 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127774 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127801 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127814 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127827 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.127840 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b4750fb-fa69-4c76-b3c8-8c250e933533-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.203253 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229061 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229114 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229147 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229177 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229292 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229334 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxl4c\" (UniqueName: \"kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.229392 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 
19:01:28.234828 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.235674 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.236762 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.238228 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.238428 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.239057 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.240010 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" event={"ID":"3b4750fb-fa69-4c76-b3c8-8c250e933533","Type":"ContainerDied","Data":"9ade6f081c1319fadb048effcb960cc927af1dff4c66be5b4816c4704ce1e880"} Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.240069 4780 scope.go:117] "RemoveContainer" containerID="4a56d8c6ff3bd869bd7453e6297a3217d7fae5390ca20a21c7949b36f190b2fb" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.240235 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfbb96789-csl6f" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.255961 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-33e2-account-create-zvxcq" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.256455 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-33e2-account-create-zvxcq" event={"ID":"8c1ed602-275e-4595-a3fe-171555e9b681","Type":"ContainerDied","Data":"793c9cc5f6a81b19b49a98206e4c0e94c72b14d7055ea2c2805ac43a560679b6"} Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.256483 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="793c9cc5f6a81b19b49a98206e4c0e94c72b14d7055ea2c2805ac43a560679b6" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.259910 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxl4c\" (UniqueName: \"kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c\") pod \"placement-5f568c9c76-zb5pj\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.275102 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czsds" event={"ID":"3f614b85-1709-4020-87c7-c349da7de2c8","Type":"ContainerDied","Data":"44cf8ae7ae74b24de23fdfb582fe567c3b206ee9b0bafcc788ae5a23e3b23d8e"} Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.275139 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44cf8ae7ae74b24de23fdfb582fe567c3b206ee9b0bafcc788ae5a23e3b23d8e" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.275274 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czsds" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.286874 4780 scope.go:117] "RemoveContainer" containerID="929a871ab45437d0084ba2f48bdddf59853c3cd5746929435fcc27ba0b9d3dd9" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.298117 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.298179 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfbb96789-csl6f"] Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330212 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data\") pod \"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330328 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys\") pod \"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330585 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle\") pod \"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330625 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngzrm\" (UniqueName: \"kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm\") pod 
\"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330671 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts\") pod \"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.330726 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys\") pod \"3f614b85-1709-4020-87c7-c349da7de2c8\" (UID: \"3f614b85-1709-4020-87c7-c349da7de2c8\") " Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.339201 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.339235 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts" (OuterVolumeSpecName: "scripts") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.340410 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.342552 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm" (OuterVolumeSpecName: "kube-api-access-ngzrm") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "kube-api-access-ngzrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.361098 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data" (OuterVolumeSpecName: "config-data") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.395339 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f614b85-1709-4020-87c7-c349da7de2c8" (UID: "3f614b85-1709-4020-87c7-c349da7de2c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.399210 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:01:28 crc kubenswrapper[4780]: E0929 19:01:28.399763 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f614b85-1709-4020-87c7-c349da7de2c8" containerName="keystone-bootstrap" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.399778 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f614b85-1709-4020-87c7-c349da7de2c8" containerName="keystone-bootstrap" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.399964 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f614b85-1709-4020-87c7-c349da7de2c8" containerName="keystone-bootstrap" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.400658 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.403616 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.404306 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.413124 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.433602 4780 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.433911 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.433924 4780 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.433988 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.434002 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngzrm\" (UniqueName: \"kubernetes.io/projected/3f614b85-1709-4020-87c7-c349da7de2c8-kube-api-access-ngzrm\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.434014 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f614b85-1709-4020-87c7-c349da7de2c8-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.494307 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.553446 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6948l\" (UniqueName: \"kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.553554 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.553580 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.553601 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.553626 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.554142 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.554343 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.554480 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.586428 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657385 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657488 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657546 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6948l\" (UniqueName: \"kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657631 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657675 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657729 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657752 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.657804 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.668204 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.669039 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.669170 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.670275 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.671093 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.676706 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.677261 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.684713 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6948l\" (UniqueName: \"kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l\") pod \"keystone-65cff5765c-kflf7\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.707594 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:01:28 crc kubenswrapper[4780]: W0929 19:01:28.719318 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb876ceca_41c0_4518_8c50_f2667f5c74ad.slice/crio-5bb5caa2afeeedd9c7a30ccb37c9e974c5ac930f2e52e1ec662bcc068c196fd7 WatchSource:0}: Error finding container 5bb5caa2afeeedd9c7a30ccb37c9e974c5ac930f2e52e1ec662bcc068c196fd7: Status 404 returned error can't find the container with id 5bb5caa2afeeedd9c7a30ccb37c9e974c5ac930f2e52e1ec662bcc068c196fd7 Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.724326 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:01:28 crc kubenswrapper[4780]: I0929 19:01:28.784732 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4750fb-fa69-4c76-b3c8-8c250e933533" path="/var/lib/kubelet/pods/3b4750fb-fa69-4c76-b3c8-8c250e933533/volumes" Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.067491 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:01:29 crc kubenswrapper[4780]: W0929 19:01:29.078016 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6105150b_678d_4925_a981_9a0d75377f32.slice/crio-1a3ce292de08a638eb8004e833c82b814d95c0296dd191f4aaf9b9bcfadee3a7 WatchSource:0}: Error finding container 1a3ce292de08a638eb8004e833c82b814d95c0296dd191f4aaf9b9bcfadee3a7: Status 404 returned error can't find the container with id 1a3ce292de08a638eb8004e833c82b814d95c0296dd191f4aaf9b9bcfadee3a7 Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.316777 4780 generic.go:334] "Generic (PLEG): container finished" podID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerID="3b45ce17ef2243b071119c687f635bd1e57c2a3657b1ca7ab22004546aaa3940" exitCode=0 Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.316881 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" event={"ID":"01004ec9-c3e3-4549-abbf-94af0692c0b1","Type":"ContainerDied","Data":"3b45ce17ef2243b071119c687f635bd1e57c2a3657b1ca7ab22004546aaa3940"} Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.327812 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerStarted","Data":"53d83bf53ca2b73549125815bc42f1206cccfca72f7d048bc647722028b998e7"} Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.340061 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:01:29 crc kubenswrapper[4780]: W0929 19:01:29.340945 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef4fe84d_ff10_4ed2_938a_669c30748336.slice/crio-8af15a8fae669be335a3c5a1557e9cfb34432f48bf945c3a6b737ebed91799b8 WatchSource:0}: Error finding container 8af15a8fae669be335a3c5a1557e9cfb34432f48bf945c3a6b737ebed91799b8: Status 404 returned error can't find the container with id 8af15a8fae669be335a3c5a1557e9cfb34432f48bf945c3a6b737ebed91799b8 Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.364664 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerStarted","Data":"1a3ce292de08a638eb8004e833c82b814d95c0296dd191f4aaf9b9bcfadee3a7"} Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.370941 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerStarted","Data":"815a0ccf2d44ac9a3c2e280110cd90eeca6a39863a8dc20e7ad3dad78f59d3bc"} Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.375304 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerStarted","Data":"5bb5caa2afeeedd9c7a30ccb37c9e974c5ac930f2e52e1ec662bcc068c196fd7"} Sep 29 19:01:29 crc 
kubenswrapper[4780]: I0929 19:01:29.527119 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-btjpk"]
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.530459 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.535159 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.535523 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmghr"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.536586 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.538009 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-btjpk"]
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.696304 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2dh\" (UniqueName: \"kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.696408 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.696440 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.797746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.797798 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.797920 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2dh\" (UniqueName: \"kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.813943 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.816615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2dh\" (UniqueName: \"kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.816837 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle\") pod \"neutron-db-sync-btjpk\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:29 crc kubenswrapper[4780]: I0929 19:01:29.889498 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-btjpk"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.397425 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6rkc" event={"ID":"f4fd579e-b8e6-4845-b5fd-b9291fe94829","Type":"ContainerStarted","Data":"0df594c76250ec5239387dd545c954b88d180c614903ce4cff7730bbbf798e32"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.400625 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" event={"ID":"01004ec9-c3e3-4549-abbf-94af0692c0b1","Type":"ContainerStarted","Data":"a48578dfa62158c3d36a1c8e328dc6d6a0dd83fa657462cb689de3bb8eb360c5"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.401492 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5674f66f87-vrjks"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.406767 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerStarted","Data":"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.406813 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerStarted","Data":"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.407190 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f568c9c76-zb5pj"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.407240 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f568c9c76-zb5pj"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.409931 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65cff5765c-kflf7" event={"ID":"ef4fe84d-ff10-4ed2-938a-669c30748336","Type":"ContainerStarted","Data":"c0d63b73993fb464d59534865d2e91fd588cd9d7e1421e00fa83f404e4d2d957"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.409968 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65cff5765c-kflf7" event={"ID":"ef4fe84d-ff10-4ed2-938a-669c30748336","Type":"ContainerStarted","Data":"8af15a8fae669be335a3c5a1557e9cfb34432f48bf945c3a6b737ebed91799b8"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.410766 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-65cff5765c-kflf7"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.419103 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerStarted","Data":"571a8931768187d5a692441d954e87a00b8f500ac28a971aecd9b67b81c3cd37"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.419345 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerStarted","Data":"6d4c4fd37e775b6a3b494b5062363c783b262e18dbb1b4d41ab25f2db10f4a7c"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.421359 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerStarted","Data":"71dd688a89f8c618ebc5ee4fecbb7031343e40724fb68c66574637f4983c6194"}
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.442948 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-65cff5765c-kflf7" podStartSLOduration=2.442927181 podStartE2EDuration="2.442927181s" podCreationTimestamp="2025-09-29 19:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:30.442432976 +0000 UTC m=+1090.390731030" watchObservedRunningTime="2025-09-29 19:01:30.442927181 +0000 UTC m=+1090.391225225"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.446564 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f6rkc" podStartSLOduration=2.49894751 podStartE2EDuration="16.446541617s" podCreationTimestamp="2025-09-29 19:01:14 +0000 UTC" firstStartedPulling="2025-09-29 19:01:15.29696841 +0000 UTC m=+1075.245266454" lastFinishedPulling="2025-09-29 19:01:29.244562517 +0000 UTC m=+1089.192860561" observedRunningTime="2025-09-29 19:01:30.419350017 +0000 UTC m=+1090.367648081" watchObservedRunningTime="2025-09-29 19:01:30.446541617 +0000 UTC m=+1090.394839661"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.472356 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f568c9c76-zb5pj" podStartSLOduration=3.472336945 podStartE2EDuration="3.472336945s" podCreationTimestamp="2025-09-29 19:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:30.465548185 +0000 UTC m=+1090.413846229" watchObservedRunningTime="2025-09-29 19:01:30.472336945 +0000 UTC m=+1090.420634989"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.510393 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" podStartSLOduration=8.510372803 podStartE2EDuration="8.510372803s" podCreationTimestamp="2025-09-29 19:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:30.500489412 +0000 UTC m=+1090.448787456" watchObservedRunningTime="2025-09-29 19:01:30.510372803 +0000 UTC m=+1090.458670847"
Sep 29 19:01:30 crc kubenswrapper[4780]: I0929 19:01:30.519101 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-btjpk"]
Sep 29 19:01:30 crc kubenswrapper[4780]: W0929 19:01:30.536306 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2603fb51_f7e5_4212_a85a_2411175cd5d7.slice/crio-a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222 WatchSource:0}: Error finding container a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222: Status 404 returned error can't find the container with id a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.455396 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerStarted","Data":"315bd5e91d306f69596be3fbe32c5679147dd79ca3d7ac06edeaadb5f75cd3f5"}
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.455532 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-log" containerID="cri-o://71dd688a89f8c618ebc5ee4fecbb7031343e40724fb68c66574637f4983c6194" gracePeriod=30
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.455612 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-httpd" containerID="cri-o://315bd5e91d306f69596be3fbe32c5679147dd79ca3d7ac06edeaadb5f75cd3f5" gracePeriod=30
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.462170 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-log" containerID="cri-o://6d4c4fd37e775b6a3b494b5062363c783b262e18dbb1b4d41ab25f2db10f4a7c" gracePeriod=30
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.462691 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-btjpk" event={"ID":"2603fb51-f7e5-4212-a85a-2411175cd5d7","Type":"ContainerStarted","Data":"ada0d3f9808c1bbda5295b4e75f3aac1b8c137677fb1e9e078bfbd6a6f89a728"}
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.462738 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-btjpk" event={"ID":"2603fb51-f7e5-4212-a85a-2411175cd5d7","Type":"ContainerStarted","Data":"a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222"}
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.462844 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd" containerID="cri-o://571a8931768187d5a692441d954e87a00b8f500ac28a971aecd9b67b81c3cd37" gracePeriod=30
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.498023 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.497999433 podStartE2EDuration="9.497999433s" podCreationTimestamp="2025-09-29 19:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:31.489496723 +0000 UTC m=+1091.437794787" watchObservedRunningTime="2025-09-29 19:01:31.497999433 +0000 UTC m=+1091.446297477"
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.509579 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-btjpk" podStartSLOduration=2.5095577430000002 podStartE2EDuration="2.509557743s" podCreationTimestamp="2025-09-29 19:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:31.507535683 +0000 UTC m=+1091.455833727" watchObservedRunningTime="2025-09-29 19:01:31.509557743 +0000 UTC m=+1091.457855787"
Sep 29 19:01:31 crc kubenswrapper[4780]: I0929 19:01:31.532330 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.532307801 podStartE2EDuration="9.532307801s" podCreationTimestamp="2025-09-29 19:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:31.529628553 +0000 UTC m=+1091.477926597" watchObservedRunningTime="2025-09-29 19:01:31.532307801 +0000 UTC m=+1091.480605845"
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.486443 4780 generic.go:334] "Generic (PLEG): container finished" podID="813dc4b7-debd-4338-a194-f349d982e892" containerID="571a8931768187d5a692441d954e87a00b8f500ac28a971aecd9b67b81c3cd37" exitCode=0
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.486491 4780 generic.go:334] "Generic (PLEG): container finished" podID="813dc4b7-debd-4338-a194-f349d982e892" containerID="6d4c4fd37e775b6a3b494b5062363c783b262e18dbb1b4d41ab25f2db10f4a7c" exitCode=143
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.486510 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerDied","Data":"571a8931768187d5a692441d954e87a00b8f500ac28a971aecd9b67b81c3cd37"}
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.486560 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerDied","Data":"6d4c4fd37e775b6a3b494b5062363c783b262e18dbb1b4d41ab25f2db10f4a7c"}
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.488013 4780 generic.go:334] "Generic (PLEG): container finished" podID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerID="315bd5e91d306f69596be3fbe32c5679147dd79ca3d7ac06edeaadb5f75cd3f5" exitCode=0
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.488031 4780 generic.go:334] "Generic (PLEG): container finished" podID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerID="71dd688a89f8c618ebc5ee4fecbb7031343e40724fb68c66574637f4983c6194" exitCode=143
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.488158 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerDied","Data":"315bd5e91d306f69596be3fbe32c5679147dd79ca3d7ac06edeaadb5f75cd3f5"}
Sep 29 19:01:32 crc kubenswrapper[4780]: I0929 19:01:32.488176 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerDied","Data":"71dd688a89f8c618ebc5ee4fecbb7031343e40724fb68c66574637f4983c6194"}
Sep 29 19:01:33 crc kubenswrapper[4780]: I0929 19:01:33.223217 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 19:01:33 crc kubenswrapper[4780]: I0929 19:01:33.223664 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 19:01:33 crc kubenswrapper[4780]: I0929 19:01:33.499645 4780 generic.go:334] "Generic (PLEG): container finished" podID="f4fd579e-b8e6-4845-b5fd-b9291fe94829" containerID="0df594c76250ec5239387dd545c954b88d180c614903ce4cff7730bbbf798e32" exitCode=0
Sep 29 19:01:33 crc kubenswrapper[4780]: I0929 19:01:33.499707 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6rkc" event={"ID":"f4fd579e-b8e6-4845-b5fd-b9291fe94829","Type":"ContainerDied","Data":"0df594c76250ec5239387dd545c954b88d180c614903ce4cff7730bbbf798e32"}
Sep 29 19:01:37 crc kubenswrapper[4780]: I0929 19:01:37.843236 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5674f66f87-vrjks"
Sep 29 19:01:37 crc kubenswrapper[4780]: I0929 19:01:37.917032 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"]
Sep 29 19:01:37 crc kubenswrapper[4780]: I0929 19:01:37.917433 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="dnsmasq-dns" containerID="cri-o://40b87188a879ca9d86e1be70e598d1e7bd09d2931848a16c8a5eaf1d9217f3f2" gracePeriod=10
Sep 29 19:01:38 crc kubenswrapper[4780]: I0929 19:01:38.566906 4780 generic.go:334] "Generic (PLEG): container finished" podID="781fb447-43b6-4035-843f-3d51675807bc" containerID="40b87188a879ca9d86e1be70e598d1e7bd09d2931848a16c8a5eaf1d9217f3f2" exitCode=0
Sep 29 19:01:38 crc kubenswrapper[4780]: I0929 19:01:38.566966 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" event={"ID":"781fb447-43b6-4035-843f-3d51675807bc","Type":"ContainerDied","Data":"40b87188a879ca9d86e1be70e598d1e7bd09d2931848a16c8a5eaf1d9217f3f2"}
Sep 29 19:01:41 crc kubenswrapper[4780]: I0929 19:01:41.891778 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused"
Sep 29 19:01:43 crc kubenswrapper[4780]: E0929 19:01:43.492382 4780 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695"
Sep 29 19:01:43 crc kubenswrapper[4780]: E0929 19:01:43.493791 4780 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t826s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rhnjt_openstack(3723c568-a926-469d-bda8-99c2a0ed7095): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 29 19:01:43 crc kubenswrapper[4780]: E0929 19:01:43.496159 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rhnjt" podUID="3723c568-a926-469d-bda8-99c2a0ed7095"
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.563238 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f6rkc"
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.624183 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f6rkc"
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.624281 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f6rkc" event={"ID":"f4fd579e-b8e6-4845-b5fd-b9291fe94829","Type":"ContainerDied","Data":"66f4dd8a95fc2a9cfe44f2eff6463d233c364fccd0f1895ad5a0d18ca96cfff5"}
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.624444 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f4dd8a95fc2a9cfe44f2eff6463d233c364fccd0f1895ad5a0d18ca96cfff5"
Sep 29 19:01:43 crc kubenswrapper[4780]: E0929 19:01:43.628024 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:e318869f706836a0c74c0ad55aab277b1bb7fae0555ae0f03cb28b379b9ce695\\\"\"" pod="openstack/cinder-db-sync-rhnjt" podUID="3723c568-a926-469d-bda8-99c2a0ed7095"
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.703171 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle\") pod \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") "
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.703227 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data\") pod \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") "
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.703429 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2vrj\" (UniqueName: \"kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj\") pod \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\" (UID: \"f4fd579e-b8e6-4845-b5fd-b9291fe94829\") "
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.716415 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f4fd579e-b8e6-4845-b5fd-b9291fe94829" (UID: "f4fd579e-b8e6-4845-b5fd-b9291fe94829"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.716621 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj" (OuterVolumeSpecName: "kube-api-access-w2vrj") pod "f4fd579e-b8e6-4845-b5fd-b9291fe94829" (UID: "f4fd579e-b8e6-4845-b5fd-b9291fe94829"). InnerVolumeSpecName "kube-api-access-w2vrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.735948 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4fd579e-b8e6-4845-b5fd-b9291fe94829" (UID: "f4fd579e-b8e6-4845-b5fd-b9291fe94829"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.805509 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2vrj\" (UniqueName: \"kubernetes.io/projected/f4fd579e-b8e6-4845-b5fd-b9291fe94829-kube-api-access-w2vrj\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.805535 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:43 crc kubenswrapper[4780]: I0929 19:01:43.805545 4780 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f4fd579e-b8e6-4845-b5fd-b9291fe94829-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.845361 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"]
Sep 29 19:01:44 crc kubenswrapper[4780]: E0929 19:01:44.846174 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fd579e-b8e6-4845-b5fd-b9291fe94829" containerName="barbican-db-sync"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.846192 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fd579e-b8e6-4845-b5fd-b9291fe94829" containerName="barbican-db-sync"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.846440 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fd579e-b8e6-4845-b5fd-b9291fe94829" containerName="barbican-db-sync"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.849240 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.855006 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44p45"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.855338 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.855560 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.873376 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"]
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.900310 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.914170 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.937090 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.937504 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.937574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.937638 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.937690 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977110 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"]
Sep 29 19:01:44 crc kubenswrapper[4780]: E0929 19:01:44.977536 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="dnsmasq-dns"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977548 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="dnsmasq-dns"
Sep 29 19:01:44 crc kubenswrapper[4780]: E0929 19:01:44.977567 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="init"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977573 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="init"
Sep 29 19:01:44 crc kubenswrapper[4780]: E0929 19:01:44.977592 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-httpd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977597 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-httpd"
Sep 29 19:01:44 crc kubenswrapper[4780]: E0929 19:01:44.977612 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-log"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977617 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-log"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977779 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="781fb447-43b6-4035-843f-3d51675807bc" containerName="dnsmasq-dns"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977806 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-log"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.977816 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" containerName="glance-httpd"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.978733 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.987390 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Sep 29 19:01:44 crc kubenswrapper[4780]: I0929 19:01:44.991840 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.044756 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.044810 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.044844 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.044901 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045758 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045781 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045804 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045829 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045882 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045916 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sw9l\" (UniqueName: \"kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045932 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.045948 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltb4j\" (UniqueName: \"kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j\") pod \"b876ceca-41c0-4518-8c50-f2667f5c74ad\" (UID: \"b876ceca-41c0-4518-8c50-f2667f5c74ad\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046014 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config\") pod \"781fb447-43b6-4035-843f-3d51675807bc\" (UID: \"781fb447-43b6-4035-843f-3d51675807bc\") "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046245 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046331 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046385 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046429 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxw5\" (UniqueName: \"kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046505 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046529 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046553 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046584 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046617 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.046650 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.049654 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.059203 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.059455 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.081913 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.098569 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.104573 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.109301 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l" (OuterVolumeSpecName: "kube-api-access-5sw9l") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "kube-api-access-5sw9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.116178 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs" (OuterVolumeSpecName: "logs") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.119364 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts" (OuterVolumeSpecName: "scripts") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.126657 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j" (OuterVolumeSpecName: "kube-api-access-ltb4j") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "kube-api-access-ltb4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.136608 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms\") pod \"barbican-worker-74988cff4c-fmczd\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151621 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxw5\" (UniqueName: \"kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151670 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151697 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151737 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151810 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151868 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151878 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-logs\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151886 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b876ceca-41c0-4518-8c50-f2667f5c74ad-httpd-run\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151895 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sw9l\" (UniqueName: \"kubernetes.io/projected/781fb447-43b6-4035-843f-3d51675807bc-kube-api-access-5sw9l\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.151943 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltb4j\" (UniqueName: \"kubernetes.io/projected/b876ceca-41c0-4518-8c50-f2667f5c74ad-kube-api-access-ltb4j\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.152166 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.156648 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.183475 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.185288 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.207751 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.214509 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74988cff4c-fmczd"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.217399 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.229310 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.241970 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxw5\" (UniqueName: \"kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5\") pod \"barbican-keystone-listener-79b866b5dd-2f72g\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.303200 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.333354 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.337880 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.359978 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369617 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48xs4\" (UniqueName: \"kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369669 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369709 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369797 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369884 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369929 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369983 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.369997 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.387355 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.391247 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.404638 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config" (OuterVolumeSpecName: "config") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.413501 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data" (OuterVolumeSpecName: "config-data") pod "b876ceca-41c0-4518-8c50-f2667f5c74ad" (UID: "b876ceca-41c0-4518-8c50-f2667f5c74ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.413670 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.417489 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "781fb447-43b6-4035-843f-3d51675807bc" (UID: "781fb447-43b6-4035-843f-3d51675807bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.431176 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.434517 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.434648 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.437029 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.472885 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48xs4\" (UniqueName: \"kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.472955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.472996 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473090 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473171 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473210 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473273 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473310 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473324 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-config\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473336 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473350 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/781fb447-43b6-4035-843f-3d51675807bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.473364 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b876ceca-41c0-4518-8c50-f2667f5c74ad-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.474217 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.474867 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.475233 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.475663 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.477646 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.493613 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48xs4\" (UniqueName: \"kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4\") pod \"dnsmasq-dns-5c856dc5f9-6szl6\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.574417 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.574495 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqn26\" (UniqueName: \"kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.574536 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.574557 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.574614 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.645539 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b876ceca-41c0-4518-8c50-f2667f5c74ad","Type":"ContainerDied","Data":"5bb5caa2afeeedd9c7a30ccb37c9e974c5ac930f2e52e1ec662bcc068c196fd7"}
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.645596 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.645840 4780 scope.go:117] "RemoveContainer" containerID="315bd5e91d306f69596be3fbe32c5679147dd79ca3d7ac06edeaadb5f75cd3f5"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.649028 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" event={"ID":"781fb447-43b6-4035-843f-3d51675807bc","Type":"ContainerDied","Data":"f106e14988143c193621934e073e382cfe4f7aef3f65a6028d3e040aa77176e0"}
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.649079 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4dc449d9-jfqfw" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.676560 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqn26\" (UniqueName: \"kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.676633 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.676655 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.676712 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.676767 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.677415 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.681099 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.685442 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.688507 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:45 crc 
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.689204 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.694107 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqn26\" (UniqueName: \"kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26\") pod \"barbican-api-55dc4b9644-4fsqf\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.703919 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.730746 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4dc449d9-jfqfw"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.757792 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.773423 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dc4b9644-4fsqf"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.780097 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.805980 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.813714 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.816990 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.819033 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.819296 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981486 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981550 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981691 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981833 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981906 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.981952 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.982007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnvb\" (UniqueName: \"kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:45 crc kubenswrapper[4780]: I0929 19:01:45.982039 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083701 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083795 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083841 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083892 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083936 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083966 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.083985 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.084357 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnvb\" (UniqueName: \"kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.084651 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.084671 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.084753 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.093158 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.097701 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.107914 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pnvb\" (UniqueName: \"kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.114792 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.118334 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.119664 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.137510 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.769353 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781fb447-43b6-4035-843f-3d51675807bc" path="/var/lib/kubelet/pods/781fb447-43b6-4035-843f-3d51675807bc/volumes"
Sep 29 19:01:46 crc kubenswrapper[4780]: I0929 19:01:46.770784 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b876ceca-41c0-4518-8c50-f2667f5c74ad" path="/var/lib/kubelet/pods/b876ceca-41c0-4518-8c50-f2667f5c74ad/volumes"
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2qkpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(229da81d-301a-46d2-892b-5ac9b0861ac1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.288261 4780 scope.go:117] "RemoveContainer" containerID="40b87188a879ca9d86e1be70e598d1e7bd09d2931848a16c8a5eaf1d9217f3f2" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.356708 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.368437 4780 scope.go:117] "RemoveContainer" containerID="96b76cb996959adad03e821840a3207fa0455cc4ed38fa699e979b42dee2978e" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.515806 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.515848 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.515925 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.515952 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.516029 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.516108 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4f7c\" (UniqueName: \"kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.516170 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data\") pod \"813dc4b7-debd-4338-a194-f349d982e892\" (UID: \"813dc4b7-debd-4338-a194-f349d982e892\") " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.519238 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.519481 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs" (OuterVolumeSpecName: "logs") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.527207 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.529817 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts" (OuterVolumeSpecName: "scripts") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.530938 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c" (OuterVolumeSpecName: "kube-api-access-g4f7c") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "kube-api-access-g4f7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.561987 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626505 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626560 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626573 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/813dc4b7-debd-4338-a194-f349d982e892-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626584 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626618 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.626632 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4f7c\" (UniqueName: \"kubernetes.io/projected/813dc4b7-debd-4338-a194-f349d982e892-kube-api-access-g4f7c\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.727210 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data" (OuterVolumeSpecName: "config-data") pod "813dc4b7-debd-4338-a194-f349d982e892" (UID: "813dc4b7-debd-4338-a194-f349d982e892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.728662 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813dc4b7-debd-4338-a194-f349d982e892-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.749471 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.749745 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"813dc4b7-debd-4338-a194-f349d982e892","Type":"ContainerDied","Data":"815a0ccf2d44ac9a3c2e280110cd90eeca6a39863a8dc20e7ad3dad78f59d3bc"} Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.749804 4780 scope.go:117] "RemoveContainer" containerID="571a8931768187d5a692441d954e87a00b8f500ac28a971aecd9b67b81c3cd37" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.750851 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.792886 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"] Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.815804 4780 scope.go:117] "RemoveContainer" containerID="6d4c4fd37e775b6a3b494b5062363c783b262e18dbb1b4d41ab25f2db10f4a7c" Sep 29 19:01:47 crc kubenswrapper[4780]: W0929 19:01:47.822295 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8150bb34_1bc0_4c45_92f8_9d8d04f611e3.slice/crio-2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75 WatchSource:0}: Error finding container 2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75: Status 404 returned error can't find the container with id 2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75 Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.830248 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.838813 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.846951 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.878680 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:47 crc kubenswrapper[4780]: E0929 19:01:47.879101 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd" Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.879113 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd" Sep 29 19:01:47 crc kubenswrapper[4780]: E0929 19:01:47.879134 4780 
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.830248 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.838813 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.846951 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.878680 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 19:01:47 crc kubenswrapper[4780]: E0929 19:01:47.879101 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.879113 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd"
Sep 29 19:01:47 crc kubenswrapper[4780]: E0929 19:01:47.879134 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-log"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.879142 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-log"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.879305 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-log"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.879319 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="813dc4b7-debd-4338-a194-f349d982e892" containerName="glance-httpd"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.884414 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.886679 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.887169 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Sep 29 19:01:47 crc kubenswrapper[4780]: I0929 19:01:47.899782 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.043985 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045096 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045162 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045237 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045314 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045538 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045709 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkbp5\" (UniqueName: \"kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.045747 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.052214 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"]
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.153915 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.153997 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154077 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkbp5\" (UniqueName: \"kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154104 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154155 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154181 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154204 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154240 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.154584 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.156298 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.158096 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.161360 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.161787 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.162099 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.166835 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.166865 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.187945 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkbp5\" (UniqueName: \"kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.188386 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.203119 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.277537 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"]
Sep 29 19:01:48 crc kubenswrapper[4780]: W0929 19:01:48.291594 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e1d2b75_0893_468d_8365_f08fa8875575.slice/crio-3c9d220b4cfd41006d2222c21a7022c7d464d547797c016309f69d63385cbc99 WatchSource:0}: Error finding container 3c9d220b4cfd41006d2222c21a7022c7d464d547797c016309f69d63385cbc99: Status 404 returned error can't find the container with id 3c9d220b4cfd41006d2222c21a7022c7d464d547797c016309f69d63385cbc99
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.297268 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"]
Sep 29 19:01:48 crc kubenswrapper[4780]: W0929 19:01:48.299157 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd61d23a_532f_4ca8_aa16_396c1390d9fa.slice/crio-ed8143d8f7fd0e79ca35866f3049b270f37b252d88f9de6b850f8f0d9cf36310 WatchSource:0}: Error finding container ed8143d8f7fd0e79ca35866f3049b270f37b252d88f9de6b850f8f0d9cf36310: Status 404 returned error can't find the container with id ed8143d8f7fd0e79ca35866f3049b270f37b252d88f9de6b850f8f0d9cf36310
Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.538065 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"]
Need to start a new one" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.563468 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.563653 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.583752 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"] Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679513 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679585 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679633 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679663 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679693 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grl7p\" (UniqueName: \"kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679718 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.679917 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.780715 4780 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="813dc4b7-debd-4338-a194-f349d982e892" path="/var/lib/kubelet/pods/813dc4b7-debd-4338-a194-f349d982e892/volumes" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.781624 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.781782 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.781882 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.781991 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.782031 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.782080 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grl7p\" (UniqueName: \"kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.782110 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.787581 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.787903 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: 
\"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.792154 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerStarted","Data":"2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.793457 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.797394 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.797608 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.798608 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.807815 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerStarted","Data":"3c9d220b4cfd41006d2222c21a7022c7d464d547797c016309f69d63385cbc99"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.814163 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grl7p\" (UniqueName: \"kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p\") pod \"barbican-api-6764d576f6-q7trv\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.853594 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerStarted","Data":"daf6d18ccdc14224060c6f099e8325aafb29cbb481423708cac3d8cea2f1c31e"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.856434 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerStarted","Data":"2d6c0e2efc44fbbe01f7ffc27d4adeac10c4f1be0262bfbba5f0c30472f1b3c5"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.856481 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" 
event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerStarted","Data":"ed8143d8f7fd0e79ca35866f3049b270f37b252d88f9de6b850f8f0d9cf36310"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.862895 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerID="9a922f3781c1132b2f38a1942407071ce610d76c6cadd603bf618999b74c58b5" exitCode=0 Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.862951 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" event={"ID":"8d951f39-e623-4b94-ab75-c47c5ea91095","Type":"ContainerDied","Data":"9a922f3781c1132b2f38a1942407071ce610d76c6cadd603bf618999b74c58b5"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.862994 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" event={"ID":"8d951f39-e623-4b94-ab75-c47c5ea91095","Type":"ContainerStarted","Data":"26d1c0fde801840eef89b19003b55e2b1fbe251b89aae656a8cb816cce6def09"} Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.935130 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:01:48 crc kubenswrapper[4780]: I0929 19:01:48.948034 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:01:48 crc kubenswrapper[4780]: W0929 19:01:48.990944 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7a98719_c1af_40eb_a2e2_b711001d277c.slice/crio-070e9830cfb317193a6f8fb2068ee4293313036cd87383713ef169bcda15daf5 WatchSource:0}: Error finding container 070e9830cfb317193a6f8fb2068ee4293313036cd87383713ef169bcda15daf5: Status 404 returned error can't find the container with id 070e9830cfb317193a6f8fb2068ee4293313036cd87383713ef169bcda15daf5 Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.689889 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"] Sep 29 19:01:49 crc kubenswrapper[4780]: W0929 19:01:49.723425 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c538b0f_23b3_440d_9775_5f33f7badfd4.slice/crio-4f7a00a71bb47ebc0f497edccce28d5b462dfb6f910a3b73f73c4fdced01a9ce WatchSource:0}: Error finding container 4f7a00a71bb47ebc0f497edccce28d5b462dfb6f910a3b73f73c4fdced01a9ce: Status 404 returned error can't find the container with id 4f7a00a71bb47ebc0f497edccce28d5b462dfb6f910a3b73f73c4fdced01a9ce Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.893690 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerStarted","Data":"377ffcb711fd744856d7c100ebf3648b3f09038e8c8073e3e330c12633505712"} Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.894178 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.900333 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerStarted","Data":"4f7a00a71bb47ebc0f497edccce28d5b462dfb6f910a3b73f73c4fdced01a9ce"} Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.902617 4780 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerStarted","Data":"070e9830cfb317193a6f8fb2068ee4293313036cd87383713ef169bcda15daf5"} Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.905877 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerStarted","Data":"08627053627ce5aafd94b2c4e15902ccd61078df75d6d9838c0d8bdbdded8cd6"} Sep 29 19:01:49 crc kubenswrapper[4780]: I0929 19:01:49.922884 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55dc4b9644-4fsqf" podStartSLOduration=4.922864418 podStartE2EDuration="4.922864418s" podCreationTimestamp="2025-09-29 19:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:49.917040443 +0000 UTC m=+1109.865338527" watchObservedRunningTime="2025-09-29 19:01:49.922864418 +0000 UTC m=+1109.871162452" Sep 29 19:01:50 crc kubenswrapper[4780]: I0929 19:01:50.781793 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.927777 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerStarted","Data":"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62"} Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.932064 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerStarted","Data":"05ffb8597183363a8b21649ea5f016f6dfe4b6c513c7f7168a08f4df425b429f"} Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.934722 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerStarted","Data":"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1"} Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.938240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" event={"ID":"8d951f39-e623-4b94-ab75-c47c5ea91095","Type":"ContainerStarted","Data":"6c0bbe21fd077a25b983dab78246c494b9ec96d2f531166882168d02c8af6087"} Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.938353 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" Sep 29 19:01:51 crc kubenswrapper[4780]: I0929 19:01:51.971790 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.97176827 podStartE2EDuration="6.97176827s" podCreationTimestamp="2025-09-29 19:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:51.962489758 +0000 UTC m=+1111.910788032" watchObservedRunningTime="2025-09-29 19:01:51.97176827 +0000 UTC m=+1111.920066314" Sep 29 19:01:52 crc kubenswrapper[4780]: I0929 19:01:52.000011 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" podStartSLOduration=6.999992789 podStartE2EDuration="6.999992789s" 
podCreationTimestamp="2025-09-29 19:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:01:51.997013714 +0000 UTC m=+1111.945311758" watchObservedRunningTime="2025-09-29 19:01:51.999992789 +0000 UTC m=+1111.948290833" Sep 29 19:01:53 crc kubenswrapper[4780]: I0929 19:01:53.984875 4780 generic.go:334] "Generic (PLEG): container finished" podID="2603fb51-f7e5-4212-a85a-2411175cd5d7" containerID="ada0d3f9808c1bbda5295b4e75f3aac1b8c137677fb1e9e078bfbd6a6f89a728" exitCode=0 Sep 29 19:01:53 crc kubenswrapper[4780]: I0929 19:01:53.984974 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-btjpk" event={"ID":"2603fb51-f7e5-4212-a85a-2411175cd5d7","Type":"ContainerDied","Data":"ada0d3f9808c1bbda5295b4e75f3aac1b8c137677fb1e9e078bfbd6a6f89a728"} Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.008490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-btjpk" event={"ID":"2603fb51-f7e5-4212-a85a-2411175cd5d7","Type":"ContainerDied","Data":"a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222"} Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.009011 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30369a57fd29a27ad05996a05ea923bf6249dab1bbd035f48e7af6c10274222" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.014016 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-btjpk" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.102356 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config\") pod \"2603fb51-f7e5-4212-a85a-2411175cd5d7\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.102450 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb2dh\" (UniqueName: \"kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh\") pod \"2603fb51-f7e5-4212-a85a-2411175cd5d7\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.102598 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle\") pod \"2603fb51-f7e5-4212-a85a-2411175cd5d7\" (UID: \"2603fb51-f7e5-4212-a85a-2411175cd5d7\") " Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.115379 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh" (OuterVolumeSpecName: "kube-api-access-jb2dh") pod "2603fb51-f7e5-4212-a85a-2411175cd5d7" (UID: "2603fb51-f7e5-4212-a85a-2411175cd5d7"). InnerVolumeSpecName "kube-api-access-jb2dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.136741 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config" (OuterVolumeSpecName: "config") pod "2603fb51-f7e5-4212-a85a-2411175cd5d7" (UID: "2603fb51-f7e5-4212-a85a-2411175cd5d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.137877 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.138854 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.152220 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2603fb51-f7e5-4212-a85a-2411175cd5d7" (UID: "2603fb51-f7e5-4212-a85a-2411175cd5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.181730 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.202004 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.204691 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.204724 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2603fb51-f7e5-4212-a85a-2411175cd5d7-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:56 crc kubenswrapper[4780]: I0929 19:01:56.204738 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb2dh\" (UniqueName: \"kubernetes.io/projected/2603fb51-f7e5-4212-a85a-2411175cd5d7-kube-api-access-jb2dh\") on node \"crc\" DevicePath \"\"" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.018422 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-btjpk" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.018549 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.019057 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.205616 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"] Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.205852 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="dnsmasq-dns" containerID="cri-o://6c0bbe21fd077a25b983dab78246c494b9ec96d2f531166882168d02c8af6087" gracePeriod=10 Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.211032 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.279594 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"] Sep 29 19:01:57 crc kubenswrapper[4780]: E0929 19:01:57.280014 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2603fb51-f7e5-4212-a85a-2411175cd5d7" containerName="neutron-db-sync" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.280025 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2603fb51-f7e5-4212-a85a-2411175cd5d7" containerName="neutron-db-sync" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.280287 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2603fb51-f7e5-4212-a85a-2411175cd5d7" containerName="neutron-db-sync" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.281291 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.286893 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.298426 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.303649 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmghr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.303850 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.303975 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.304120 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.321561 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"] Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.334189 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435056 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435349 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435375 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435425 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435451 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkgh\" (UniqueName: \"kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 
19:01:57.435484 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435506 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435538 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435557 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.435610 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjtdg\" (UniqueName: \"kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.542135 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.542260 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.542328 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.542365 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543317 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rjtdg\" (UniqueName: \"kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543529 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543607 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543660 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543696 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543812 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.543901 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkgh\" (UniqueName: \"kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.545678 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.548804 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.549450 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.549770 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.554910 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.561252 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.562038 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.578904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.592718 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.594349 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjtdg\" (UniqueName: \"kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg\") pod \"neutron-686fd87d4d-xmdcq\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.594621 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkgh\" (UniqueName: \"kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh\") pod \"dnsmasq-dns-548b47b48c-fwkrr\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.671959 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.701637 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:01:57 crc kubenswrapper[4780]: I0929 19:01:57.935596 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:58 crc kubenswrapper[4780]: I0929 19:01:58.041469 4780 generic.go:334] "Generic (PLEG): container finished" podID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerID="6c0bbe21fd077a25b983dab78246c494b9ec96d2f531166882168d02c8af6087" exitCode=0 Sep 29 19:01:58 crc kubenswrapper[4780]: I0929 19:01:58.041530 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" event={"ID":"8d951f39-e623-4b94-ab75-c47c5ea91095","Type":"ContainerDied","Data":"6c0bbe21fd077a25b983dab78246c494b9ec96d2f531166882168d02c8af6087"} Sep 29 19:01:58 crc kubenswrapper[4780]: I0929 19:01:58.305270 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.844286 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916094 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916301 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48xs4\" (UniqueName: \"kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916437 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916555 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916624 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.916657 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0\") pod \"8d951f39-e623-4b94-ab75-c47c5ea91095\" (UID: \"8d951f39-e623-4b94-ab75-c47c5ea91095\") " Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.947809 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4" (OuterVolumeSpecName: 
"kube-api-access-48xs4") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "kube-api-access-48xs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:01:59 crc kubenswrapper[4780]: I0929 19:01:59.981885 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.019762 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48xs4\" (UniqueName: \"kubernetes.io/projected/8d951f39-e623-4b94-ab75-c47c5ea91095-kube-api-access-48xs4\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.076725 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" event={"ID":"8d951f39-e623-4b94-ab75-c47c5ea91095","Type":"ContainerDied","Data":"26d1c0fde801840eef89b19003b55e2b1fbe251b89aae656a8cb816cce6def09"} Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.076777 4780 scope.go:117] "RemoveContainer" containerID="6c0bbe21fd077a25b983dab78246c494b9ec96d2f531166882168d02c8af6087" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.076918 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c856dc5f9-6szl6" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.086203 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerStarted","Data":"4cac85a12f0ad40a5dc9410707339b2f4a75fbc7d7e9f99310b24f564e2e4f03"} Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.340215 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"] Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.381373 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.427423 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.487811 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.505760 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config" (OuterVolumeSpecName: "config") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:00 crc kubenswrapper[4780]: W0929 19:02:00.525257 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc253bf8c_311c_4f8c_ba29_a7533cf02f42.slice/crio-8b37a1bc08aa40ea5829412987ba01fa80e3dae6dc4ff339e98440af9dd6d27d WatchSource:0}: Error finding container 8b37a1bc08aa40ea5829412987ba01fa80e3dae6dc4ff339e98440af9dd6d27d: Status 404 returned error can't find the container with id 8b37a1bc08aa40ea5829412987ba01fa80e3dae6dc4ff339e98440af9dd6d27d Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.532918 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.532963 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.533698 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.548974 4780 scope.go:117] "RemoveContainer" containerID="9a922f3781c1132b2f38a1942407071ce610d76c6cadd603bf618999b74c58b5" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.587414 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.599097 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d951f39-e623-4b94-ab75-c47c5ea91095" (UID: "8d951f39-e623-4b94-ab75-c47c5ea91095"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.636667 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.636699 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d951f39-e623-4b94-ab75-c47c5ea91095-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:00 crc kubenswrapper[4780]: I0929 19:02:00.812785 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.030526 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"] Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.054194 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c856dc5f9-6szl6"] Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.136437 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerStarted","Data":"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c"} Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.137699 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.137721 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.151923 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6764d576f6-q7trv" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.156:9311/healthcheck\": dial tcp 10.217.0.156:9311: connect: connection refused" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.152427 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" event={"ID":"c253bf8c-311c-4f8c-ba29-a7533cf02f42","Type":"ContainerStarted","Data":"8b37a1bc08aa40ea5829412987ba01fa80e3dae6dc4ff339e98440af9dd6d27d"} Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.170343 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerStarted","Data":"9a76c11b9e730bd001245d655e4f0f6cf000be8580e41920c4206be241127d0b"} Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.182271 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6764d576f6-q7trv" podStartSLOduration=13.182248662 podStartE2EDuration="13.182248662s" podCreationTimestamp="2025-09-29 19:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:01.169981205 +0000 UTC m=+1121.118279249" watchObservedRunningTime="2025-09-29 19:02:01.182248662 +0000 UTC m=+1121.130546706" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.185227 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" 
event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerStarted","Data":"4b807f34a3c65b6d836e3bd255f8320430de3cf2180ee8e33b572ba6e6717b3b"} Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.262590 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" podStartSLOduration=6.035013761 podStartE2EDuration="17.262556884s" podCreationTimestamp="2025-09-29 19:01:44 +0000 UTC" firstStartedPulling="2025-09-29 19:01:48.293872164 +0000 UTC m=+1108.242170208" lastFinishedPulling="2025-09-29 19:01:59.521415287 +0000 UTC m=+1119.469713331" observedRunningTime="2025-09-29 19:02:01.227988826 +0000 UTC m=+1121.176286870" watchObservedRunningTime="2025-09-29 19:02:01.262556884 +0000 UTC m=+1121.210854928" Sep 29 19:02:01 crc kubenswrapper[4780]: E0929 19:02:01.319201 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.502134 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:01 crc kubenswrapper[4780]: I0929 19:02:01.502280 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.070592 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"] Sep 29 19:02:02 crc kubenswrapper[4780]: E0929 19:02:02.071966 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="dnsmasq-dns" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.071987 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="dnsmasq-dns" Sep 29 19:02:02 crc kubenswrapper[4780]: E0929 19:02:02.072038 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="init" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.072048 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="init" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.072279 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" containerName="dnsmasq-dns" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.085696 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.092721 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.093789 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.126938 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"] Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222464 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2bw\" (UniqueName: \"kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222583 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222635 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222683 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222704 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222728 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.222785 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.223672 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" 
event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerStarted","Data":"a3bd0c44347129dd3eb6a433ef0ca6e0cd25372b1057012a51a9c55da4d8ff4a"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.246965 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerStarted","Data":"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.257619 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rhnjt" event={"ID":"3723c568-a926-469d-bda8-99c2a0ed7095","Type":"ContainerStarted","Data":"0affd5ee1b6271df452bc20498cfdf2af34f4084addeda0fabf4b600b2de3adf"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.261565 4780 generic.go:334] "Generic (PLEG): container finished" podID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerID="8b720e49ca1d381276eb76f54b37e08c9dcf8df517560ea1d2da5ecf90760b1b" exitCode=0 Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.261704 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" event={"ID":"c253bf8c-311c-4f8c-ba29-a7533cf02f42","Type":"ContainerDied","Data":"8b720e49ca1d381276eb76f54b37e08c9dcf8df517560ea1d2da5ecf90760b1b"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.270521 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerStarted","Data":"cda7913fd0621ca1279d65abdace08c6c3cf4bea92aff1460d8a3de07e845571"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.270740 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-central-agent" containerID="cri-o://719bf0b0631d9e71a1709977874890c41e75b7a3ac1293fbcd3b57f90768e4a0" gracePeriod=30 Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.270832 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.270881 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="proxy-httpd" containerID="cri-o://cda7913fd0621ca1279d65abdace08c6c3cf4bea92aff1460d8a3de07e845571" gracePeriod=30 Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.270930 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-notification-agent" containerID="cri-o://53d83bf53ca2b73549125815bc42f1206cccfca72f7d048bc647722028b998e7" gracePeriod=30 Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.306728 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerStarted","Data":"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.306794 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerStarted","Data":"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195"} Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.307878 4780 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-686fd87d4d-xmdcq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.317024 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerStarted","Data":"b7c50fc2d9534221112ba9758fee8b52356d9efe5f5ed3fb8c0432498719f180"}
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.317998 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.317985751 podStartE2EDuration="15.317985751s" podCreationTimestamp="2025-09-29 19:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:02.315904592 +0000 UTC m=+1122.264202636" watchObservedRunningTime="2025-09-29 19:02:02.317985751 +0000 UTC m=+1122.266283795"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.325874 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335293 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2bw\" (UniqueName: \"kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335536 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335698 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335874 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335914 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.335977 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.346071 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.352011 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.353223 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.372848 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.372950 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.380964 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.393045 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2bw\" (UniqueName: \"kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw\") pod \"neutron-5d954bbbf5-jklnq\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.462595 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.562923 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rhnjt" podStartSLOduration=4.20652402 podStartE2EDuration="48.562896069s" podCreationTimestamp="2025-09-29 19:01:14 +0000 UTC" firstStartedPulling="2025-09-29 19:01:15.196433446 +0000 UTC m=+1075.144731490" lastFinishedPulling="2025-09-29 19:01:59.552805495 +0000 UTC m=+1119.501103539" observedRunningTime="2025-09-29 19:02:02.426808219 +0000 UTC m=+1122.375106283" watchObservedRunningTime="2025-09-29 19:02:02.562896069 +0000 UTC m=+1122.511194123"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.565168 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-686fd87d4d-xmdcq" podStartSLOduration=5.565134413 podStartE2EDuration="5.565134413s" podCreationTimestamp="2025-09-29 19:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:02.476221347 +0000 UTC m=+1122.424519421" watchObservedRunningTime="2025-09-29 19:02:02.565134413 +0000 UTC m=+1122.513432457"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.616436 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-74988cff4c-fmczd" podStartSLOduration=6.923708291 podStartE2EDuration="18.616413283s" podCreationTimestamp="2025-09-29 19:01:44 +0000 UTC" firstStartedPulling="2025-09-29 19:01:47.828263532 +0000 UTC m=+1107.776561586" lastFinishedPulling="2025-09-29 19:01:59.520968524 +0000 UTC m=+1119.469266578" observedRunningTime="2025-09-29 19:02:02.51484596 +0000 UTC m=+1122.463144004" watchObservedRunningTime="2025-09-29 19:02:02.616413283 +0000 UTC m=+1122.564711327"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.622653 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-65cff5765c-kflf7"
Sep 29 19:02:02 crc kubenswrapper[4780]: I0929 19:02:02.675972 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.164456 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d951f39-e623-4b94-ab75-c47c5ea91095" path="/var/lib/kubelet/pods/8d951f39-e623-4b94-ab75-c47c5ea91095/volumes"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.231330 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.231407 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.231464 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.232403 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.232463 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc" gracePeriod=600
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.346741 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" event={"ID":"c253bf8c-311c-4f8c-ba29-a7533cf02f42","Type":"ContainerStarted","Data":"4aebdec038dbac143882c5c80de1a492b75c0c952e0ef360a640daa9211caf5d"}
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.348772 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.360039 4780 generic.go:334] "Generic (PLEG): container finished" podID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerID="cda7913fd0621ca1279d65abdace08c6c3cf4bea92aff1460d8a3de07e845571" exitCode=0
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.360413 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerDied","Data":"cda7913fd0621ca1279d65abdace08c6c3cf4bea92aff1460d8a3de07e845571"}
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.402660 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" podStartSLOduration=6.402626875 podStartE2EDuration="6.402626875s" podCreationTimestamp="2025-09-29 19:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:03.380804368 +0000 UTC m=+1123.329102442" watchObservedRunningTime="2025-09-29 19:02:03.402626875 +0000 UTC m=+1123.350924919"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.476067 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"]
Sep 29 19:02:03 crc kubenswrapper[4780]: W0929 19:02:03.491957 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeb1a5cd_ae45_4e38_bcf7_40cd5a76b7d8.slice/crio-198fd34207081e7daec72f7217803b559a57952b4bf8e4b0c5ece602f309f027 WatchSource:0}: Error finding container 198fd34207081e7daec72f7217803b559a57952b4bf8e4b0c5ece602f309f027: Status 404 returned error can't find the container with id 198fd34207081e7daec72f7217803b559a57952b4bf8e4b0c5ece602f309f027
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.758841 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.760470 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.764226 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bnqkm"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.764442 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.764656 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.781820 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.845449 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c55lh\" (UniqueName: \"kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.845905 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.845949 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.846007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.950618 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.952397 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c55lh\" (UniqueName: \"kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.952580 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.952622 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.957705 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.962621 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.971775 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:03 crc kubenswrapper[4780]: I0929 19:02:03.982750 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c55lh\" (UniqueName: \"kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh\") pod \"openstackclient\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " pod="openstack/openstackclient"
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.177068 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.439363 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc" exitCode=0
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.440216 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.442488 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.442514 4780 scope.go:117] "RemoveContainer" containerID="b940a355395049d621b81f1ec2d095c7832b21f04570b0b8f54122a46e556f20"
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.463839 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerStarted","Data":"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.463919 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerStarted","Data":"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.463934 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerStarted","Data":"198fd34207081e7daec72f7217803b559a57952b4bf8e4b0c5ece602f309f027"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.465305 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d954bbbf5-jklnq"
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.472194 4780 generic.go:334] "Generic (PLEG): container finished" podID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerID="719bf0b0631d9e71a1709977874890c41e75b7a3ac1293fbcd3b57f90768e4a0" exitCode=0
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.472746 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerDied","Data":"719bf0b0631d9e71a1709977874890c41e75b7a3ac1293fbcd3b57f90768e4a0"}
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.533316 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d954bbbf5-jklnq" podStartSLOduration=2.533290202 podStartE2EDuration="2.533290202s" podCreationTimestamp="2025-09-29 19:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:04.504724664 +0000 UTC m=+1124.453022698" watchObservedRunningTime="2025-09-29 19:02:04.533290202 +0000 UTC m=+1124.481588246"
Sep 29 19:02:04 crc kubenswrapper[4780]: I0929 19:02:04.568781 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Sep 29 19:02:05 crc kubenswrapper[4780]: I0929 19:02:05.499179 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3608c7b9-1f29-491f-9a10-48135b074fa4","Type":"ContainerStarted","Data":"7f41bb22645ff4a92610426797af71d222568ead52d9375831ad95c43fabc4d8"}
Sep 29 19:02:05 crc kubenswrapper[4780]: I0929 19:02:05.834173 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6764d576f6-q7trv"
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.519668 4780 generic.go:334] "Generic (PLEG): container finished" podID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerID="53d83bf53ca2b73549125815bc42f1206cccfca72f7d048bc647722028b998e7" exitCode=0
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.520760 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerDied","Data":"53d83bf53ca2b73549125815bc42f1206cccfca72f7d048bc647722028b998e7"}
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.777755 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.823891 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824009 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824136 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824212 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824240 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824290 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qkpt\" (UniqueName: \"kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.824447 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data\") pod \"229da81d-301a-46d2-892b-5ac9b0861ac1\" (UID: \"229da81d-301a-46d2-892b-5ac9b0861ac1\") "
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.830165 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.830485 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.836186 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.837184 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts" (OuterVolumeSpecName: "scripts") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.843239 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt" (OuterVolumeSpecName: "kube-api-access-2qkpt") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "kube-api-access-2qkpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.926690 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.926731 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.926741 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.926750 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/229da81d-301a-46d2-892b-5ac9b0861ac1-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.926761 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qkpt\" (UniqueName: \"kubernetes.io/projected/229da81d-301a-46d2-892b-5ac9b0861ac1-kube-api-access-2qkpt\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:06 crc kubenswrapper[4780]: I0929 19:02:06.973159 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.018767 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data" (OuterVolumeSpecName: "config-data") pod "229da81d-301a-46d2-892b-5ac9b0861ac1" (UID: "229da81d-301a-46d2-892b-5ac9b0861ac1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.030317 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.030350 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229da81d-301a-46d2-892b-5ac9b0861ac1-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.532240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"229da81d-301a-46d2-892b-5ac9b0861ac1","Type":"ContainerDied","Data":"21bb7ad014be7240b97cf79e0c08c9685ee395c6724cb70aa3fa08ff483f2280"}
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.532318 4780 scope.go:117] "RemoveContainer" containerID="cda7913fd0621ca1279d65abdace08c6c3cf4bea92aff1460d8a3de07e845571"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.532325 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.563717 4780 scope.go:117] "RemoveContainer" containerID="53d83bf53ca2b73549125815bc42f1206cccfca72f7d048bc647722028b998e7"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.640106 4780 scope.go:117] "RemoveContainer" containerID="719bf0b0631d9e71a1709977874890c41e75b7a3ac1293fbcd3b57f90768e4a0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.651546 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.665745 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.677146 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 29 19:02:07 crc kubenswrapper[4780]: E0929 19:02:07.677671 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-notification-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.677684 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-notification-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: E0929 19:02:07.677703 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-central-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.677711 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-central-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: E0929 19:02:07.677753 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="proxy-httpd"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.677760 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="proxy-httpd"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.679170 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="proxy-httpd"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.679200 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-central-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.679224 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" containerName="ceilometer-notification-agent"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.699474 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.703885 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.708579 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.711238 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.746580 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.746714 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.748703 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.748794 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.748848 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.748972 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.748998 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmrsv\" (UniqueName: \"kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852703 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852778 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852809 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852854 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852873 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmrsv\" (UniqueName: \"kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.852941 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.853004 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.855511 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.855771 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.865434 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.875830 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.880719 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.881993 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:07 crc kubenswrapper[4780]: I0929 19:02:07.885664 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmrsv\" (UniqueName: \"kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv\") pod \"ceilometer-0\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " pod="openstack/ceilometer-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.037156 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.207448 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.207511 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.324472 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.339044 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.545269 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.545406 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.600395 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6764d576f6-q7trv"
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.703938 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"]
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.705182 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" containerID="cri-o://2d6c0e2efc44fbbe01f7ffc27d4adeac10c4f1be0262bfbba5f0c30472f1b3c5" gracePeriod=30
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.705924 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" containerID="cri-o://377ffcb711fd744856d7c100ebf3648b3f09038e8c8073e3e330c12633505712" gracePeriod=30
Sep 29 19:02:08 crc kubenswrapper[4780]: I0929 19:02:08.793951 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229da81d-301a-46d2-892b-5ac9b0861ac1" path="/var/lib/kubelet/pods/229da81d-301a-46d2-892b-5ac9b0861ac1/volumes"
Sep 29 crc kubenswrapper[4780]: I0929 19:02:08.796483 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 29 19:02:08 crc kubenswrapper[4780]: W0929 19:02:08.838397 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb15b3eff_0997_4228_af66_1f9caecc40bc.slice/crio-004214a5b6b5b5995acea722bd8c6ffb32c188e008ed493e304902d6eed229a7 WatchSource:0}: Error finding container 004214a5b6b5b5995acea722bd8c6ffb32c188e008ed493e304902d6eed229a7: Status 404 returned error can't find the container with id 004214a5b6b5b5995acea722bd8c6ffb32c188e008ed493e304902d6eed229a7
Sep 29 19:02:09 crc kubenswrapper[4780]: I0929 19:02:09.559147 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerStarted","Data":"004214a5b6b5b5995acea722bd8c6ffb32c188e008ed493e304902d6eed229a7"}
Sep 29 19:02:09 crc kubenswrapper[4780]: I0929 19:02:09.569858 4780 generic.go:334] "Generic (PLEG): container finished" podID="3723c568-a926-469d-bda8-99c2a0ed7095" containerID="0affd5ee1b6271df452bc20498cfdf2af34f4084addeda0fabf4b600b2de3adf" exitCode=0
Sep 29 19:02:09 crc kubenswrapper[4780]: I0929 19:02:09.569946 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rhnjt" event={"ID":"3723c568-a926-469d-bda8-99c2a0ed7095","Type":"ContainerDied","Data":"0affd5ee1b6271df452bc20498cfdf2af34f4084addeda0fabf4b600b2de3adf"}
Sep 29 19:02:09 crc kubenswrapper[4780]: I0929 19:02:09.582321 4780 generic.go:334] "Generic (PLEG): container finished" podID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerID="2d6c0e2efc44fbbe01f7ffc27d4adeac10c4f1be0262bfbba5f0c30472f1b3c5" exitCode=143
Sep 29 19:02:09 crc kubenswrapper[4780]: I0929 19:02:09.583380 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerDied","Data":"2d6c0e2efc44fbbe01f7ffc27d4adeac10c4f1be0262bfbba5f0c30472f1b3c5"}
Sep 29 19:02:10 crc kubenswrapper[4780]: I0929 19:02:10.603430 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerStarted","Data":"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a"}
Sep 29 19:02:10 crc kubenswrapper[4780]: I0929 19:02:10.603758 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerStarted","Data":"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f"}
Sep 29 19:02:10 crc kubenswrapper[4780]: I0929 19:02:10.603510 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 19:02:10 crc kubenswrapper[4780]: I0929 19:02:10.603788 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.221854 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rhnjt"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256535 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256639 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256666 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256711 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256736 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t826s\" (UniqueName: \"kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.256776 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id\") pod \"3723c568-a926-469d-bda8-99c2a0ed7095\" (UID: \"3723c568-a926-469d-bda8-99c2a0ed7095\") "
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.258176 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.266949 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts" (OuterVolumeSpecName: "scripts") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.273254 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.273327 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s" (OuterVolumeSpecName: "kube-api-access-t826s") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "kube-api-access-t826s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.319153 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.393896 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.393931 4780 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.393943 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t826s\" (UniqueName: \"kubernetes.io/projected/3723c568-a926-469d-bda8-99c2a0ed7095-kube-api-access-t826s\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.393953 4780 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3723c568-a926-469d-bda8-99c2a0ed7095-etc-machine-id\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.393962 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.394619 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.395713 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.413159 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data" (OuterVolumeSpecName: "config-data") pod "3723c568-a926-469d-bda8-99c2a0ed7095" (UID: "3723c568-a926-469d-bda8-99c2a0ed7095"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.495787 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3723c568-a926-469d-bda8-99c2a0ed7095-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.627910 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rhnjt"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.628356 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rhnjt" event={"ID":"3723c568-a926-469d-bda8-99c2a0ed7095","Type":"ContainerDied","Data":"d11bad14f678cb07017d151b28f0cc3f55a6e91a28db9fed10eb65a60cfbaa40"}
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.628406 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11bad14f678cb07017d151b28f0cc3f55a6e91a28db9fed10eb65a60cfbaa40"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.914491 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 19:02:11 crc kubenswrapper[4780]: E0929 19:02:11.914897 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3723c568-a926-469d-bda8-99c2a0ed7095" containerName="cinder-db-sync"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.914920 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3723c568-a926-469d-bda8-99c2a0ed7095" containerName="cinder-db-sync"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.919204 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3723c568-a926-469d-bda8-99c2a0ed7095" containerName="cinder-db-sync"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.920387 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.942510 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.942731 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nmjgd"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.942846 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.942988 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 29 19:02:11 crc kubenswrapper[4780]: I0929 19:02:11.942989 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.006668 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.006758 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8j6\" (UniqueName: \"kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.007437 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.007495 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.007662 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.007705 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.054671 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.054991 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" containerID="cri-o://4aebdec038dbac143882c5c80de1a492b75c0c952e0ef360a640daa9211caf5d" gracePeriod=10
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.056746 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.105700 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.107456 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109181 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109239 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109298 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109360 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8j6\" (UniqueName: \"kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109393 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.109432 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.111529 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.122165 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.127519 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.138705 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.139098 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.153702 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.167012 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8j6\" (UniqueName: \"kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6\") pod \"cinder-scheduler-0\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.217497 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.218101 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.218177 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.218236 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2xr\" (UniqueName: \"kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.218267 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.218325 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.253716 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.256379 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.277442 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.289708 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.294423 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321383 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321457 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321503 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2xr\" (UniqueName: \"kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321525 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321814 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321913 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.321946 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74"
Sep 29 19:02:12 crc kubenswrapper[4780]: I0929
19:02:12.322016 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.322055 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.322110 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.322135 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.322204 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.322225 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb97x\" (UniqueName: \"kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.323277 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.323990 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.326523 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.328755 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.335501 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.356441 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2xr\" (UniqueName: \"kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr\") pod \"dnsmasq-dns-6c47bb5d77-6bb74\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425288 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425345 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425366 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425391 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425408 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425438 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb97x\" (UniqueName: \"kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.425470 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.427679 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.427995 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.429946 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.430286 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.431178 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.447367 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.449750 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb97x\" (UniqueName: \"kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x\") pod \"cinder-api-0\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.450141 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:53792->10.217.0.153:9311: read: connection reset by peer" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.450642 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:53778->10.217.0.153:9311: read: connection reset by peer" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.594915 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.647258 4780 generic.go:334] "Generic (PLEG): container finished" podID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerID="377ffcb711fd744856d7c100ebf3648b3f09038e8c8073e3e330c12633505712" exitCode=0 Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.647350 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerDied","Data":"377ffcb711fd744856d7c100ebf3648b3f09038e8c8073e3e330c12633505712"} Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.650444 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.657304 4780 generic.go:334] "Generic (PLEG): container finished" podID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerID="4aebdec038dbac143882c5c80de1a492b75c0c952e0ef360a640daa9211caf5d" exitCode=0 Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.658632 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" event={"ID":"c253bf8c-311c-4f8c-ba29-a7533cf02f42","Type":"ContainerDied","Data":"4aebdec038dbac143882c5c80de1a492b75c0c952e0ef360a640daa9211caf5d"} Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.673022 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.849658 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.851485 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.851896 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.861623 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.862233 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.862340 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940019 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940480 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2hj\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940640 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940686 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940796 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940887 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.940929 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.941161 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:12 crc kubenswrapper[4780]: I0929 19:02:12.987154 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046364 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046478 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046561 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046633 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046764 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2hj\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046807 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046831 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.046869 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " 
pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.047359 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.047435 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.054034 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.054691 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.055493 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.056691 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.070996 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2hj\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.075782 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift\") pod \"swift-proxy-58b5d8cc69-dbww7\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:13 crc kubenswrapper[4780]: I0929 19:02:13.189151 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:14 crc kubenswrapper[4780]: I0929 19:02:14.479949 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.229994 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.230272 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-log" containerID="cri-o://08627053627ce5aafd94b2c4e15902ccd61078df75d6d9838c0d8bdbdded8cd6" gracePeriod=30 Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.230412 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-httpd" containerID="cri-o://05ffb8597183363a8b21649ea5f016f6dfe4b6c513c7f7168a08f4df425b429f" gracePeriod=30 Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.695933 4780 generic.go:334] "Generic (PLEG): container finished" podID="60211468-1dd1-4611-9009-cba4f4194aad" containerID="08627053627ce5aafd94b2c4e15902ccd61078df75d6d9838c0d8bdbdded8cd6" exitCode=143 Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.696029 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerDied","Data":"08627053627ce5aafd94b2c4e15902ccd61078df75d6d9838c0d8bdbdded8cd6"} Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.775687 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Sep 29 19:02:15 crc kubenswrapper[4780]: I0929 19:02:15.775691 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dc4b9644-4fsqf" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Sep 29 19:02:17 crc kubenswrapper[4780]: I0929 19:02:17.673930 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Sep 29 19:02:18 crc kubenswrapper[4780]: I0929 19:02:18.391438 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:41012->10.217.0.154:9292: read: connection reset by peer" Sep 29 19:02:18 crc kubenswrapper[4780]: I0929 19:02:18.391445 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:41004->10.217.0.154:9292: read: connection reset by peer" Sep 29 
19:02:18 crc kubenswrapper[4780]: I0929 19:02:18.735666 4780 generic.go:334] "Generic (PLEG): container finished" podID="60211468-1dd1-4611-9009-cba4f4194aad" containerID="05ffb8597183363a8b21649ea5f016f6dfe4b6c513c7f7168a08f4df425b429f" exitCode=0 Sep 29 19:02:18 crc kubenswrapper[4780]: I0929 19:02:18.735752 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerDied","Data":"05ffb8597183363a8b21649ea5f016f6dfe4b6c513c7f7168a08f4df425b429f"} Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.878873 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.917839 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.917901 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.918038 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkgh\" (UniqueName: \"kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.918153 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.918326 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.918352 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config\") pod \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\" (UID: \"c253bf8c-311c-4f8c-ba29-a7533cf02f42\") " Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.943805 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh" (OuterVolumeSpecName: "kube-api-access-lqkgh") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "kube-api-access-lqkgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:19 crc kubenswrapper[4780]: I0929 19:02:19.985996 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.019908 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.021811 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.021849 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.021859 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkgh\" (UniqueName: \"kubernetes.io/projected/c253bf8c-311c-4f8c-ba29-a7533cf02f42-kube-api-access-lqkgh\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.022265 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.032067 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.063985 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config" (OuterVolumeSpecName: "config") pod "c253bf8c-311c-4f8c-ba29-a7533cf02f42" (UID: "c253bf8c-311c-4f8c-ba29-a7533cf02f42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.091885 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.103570 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125119 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle\") pod \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125240 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125278 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125333 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125357 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125429 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqn26\" (UniqueName: \"kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26\") pod \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125458 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data\") pod \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125516 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs\") pod \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125542 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pnvb\" (UniqueName: \"kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125615 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc 
kubenswrapper[4780]: I0929 19:02:20.125645 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom\") pod \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\" (UID: \"fd61d23a-532f-4ca8-aa16-396c1390d9fa\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125752 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.125773 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle\") pod \"60211468-1dd1-4611-9009-cba4f4194aad\" (UID: \"60211468-1dd1-4611-9009-cba4f4194aad\") " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.126377 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.126396 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.126409 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c253bf8c-311c-4f8c-ba29-a7533cf02f42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.134765 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs" (OuterVolumeSpecName: "logs") pod "fd61d23a-532f-4ca8-aa16-396c1390d9fa" (UID: "fd61d23a-532f-4ca8-aa16-396c1390d9fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.135585 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.136598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fd61d23a-532f-4ca8-aa16-396c1390d9fa" (UID: "fd61d23a-532f-4ca8-aa16-396c1390d9fa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.136953 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs" (OuterVolumeSpecName: "logs") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.146664 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb" (OuterVolumeSpecName: "kube-api-access-6pnvb") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "kube-api-access-6pnvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.163733 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26" (OuterVolumeSpecName: "kube-api-access-cqn26") pod "fd61d23a-532f-4ca8-aa16-396c1390d9fa" (UID: "fd61d23a-532f-4ca8-aa16-396c1390d9fa"). InnerVolumeSpecName "kube-api-access-cqn26". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.170281 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts" (OuterVolumeSpecName: "scripts") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.189689 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233433 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pnvb\" (UniqueName: \"kubernetes.io/projected/60211468-1dd1-4611-9009-cba4f4194aad-kube-api-access-6pnvb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233483 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233495 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233530 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233543 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60211468-1dd1-4611-9009-cba4f4194aad-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233555 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233567 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqn26\" (UniqueName: 
\"kubernetes.io/projected/fd61d23a-532f-4ca8-aa16-396c1390d9fa-kube-api-access-cqn26\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.233579 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd61d23a-532f-4ca8-aa16-396c1390d9fa-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.247501 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd61d23a-532f-4ca8-aa16-396c1390d9fa" (UID: "fd61d23a-532f-4ca8-aa16-396c1390d9fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.251670 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data" (OuterVolumeSpecName: "config-data") pod "fd61d23a-532f-4ca8-aa16-396c1390d9fa" (UID: "fd61d23a-532f-4ca8-aa16-396c1390d9fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.254983 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.255345 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.265239 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.294449 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.318606 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data" (OuterVolumeSpecName: "config-data") pod "60211468-1dd1-4611-9009-cba4f4194aad" (UID: "60211468-1dd1-4611-9009-cba4f4194aad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.336154 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.336586 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.336599 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60211468-1dd1-4611-9009-cba4f4194aad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.336608 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.336617 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd61d23a-532f-4ca8-aa16-396c1390d9fa-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.337581 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.440103 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.485778 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"] Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.654969 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.790088 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerStarted","Data":"5c83268540159aa38b72bf722a5fe2387b9ac1dcc4d6ec487e0383efdab63ac6"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.797576 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60211468-1dd1-4611-9009-cba4f4194aad","Type":"ContainerDied","Data":"daf6d18ccdc14224060c6f099e8325aafb29cbb481423708cac3d8cea2f1c31e"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.797643 4780 scope.go:117] "RemoveContainer" containerID="05ffb8597183363a8b21649ea5f016f6dfe4b6c513c7f7168a08f4df425b429f" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.797881 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.826636 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerStarted","Data":"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.829583 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerStarted","Data":"2960a2c291d26fb46da0a2ce001ef0586854f9e68db645bfc03a989166650237"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.832222 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerStarted","Data":"bdda99f2153d22ba1cb1b9a550816fb787343da1b254b6b8fb4d4bcc8ad47421"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.843105 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dc4b9644-4fsqf" event={"ID":"fd61d23a-532f-4ca8-aa16-396c1390d9fa","Type":"ContainerDied","Data":"ed8143d8f7fd0e79ca35866f3049b270f37b252d88f9de6b850f8f0d9cf36310"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.843414 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dc4b9644-4fsqf" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.869125 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" event={"ID":"c253bf8c-311c-4f8c-ba29-a7533cf02f42","Type":"ContainerDied","Data":"8b37a1bc08aa40ea5829412987ba01fa80e3dae6dc4ff339e98440af9dd6d27d"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.869503 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-548b47b48c-fwkrr" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.872023 4780 scope.go:117] "RemoveContainer" containerID="08627053627ce5aafd94b2c4e15902ccd61078df75d6d9838c0d8bdbdded8cd6" Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.880502 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" event={"ID":"b0e696ba-be42-4d7e-8208-d5ff25c7b61c","Type":"ContainerStarted","Data":"590a1d68e37b70764ad99348cd066670a58cf392cf76038d3b197042c2734fd9"} Sep 29 19:02:20 crc kubenswrapper[4780]: I0929 19:02:20.901540 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3608c7b9-1f29-491f-9a10-48135b074fa4","Type":"ContainerStarted","Data":"f9639917e8369f971bbda3bb865e49a2021d379743b2c56bab33019529c9a847"} Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.042970 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.069861 4780 scope.go:117] "RemoveContainer" containerID="377ffcb711fd744856d7c100ebf3648b3f09038e8c8073e3e330c12633505712" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.079771 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.104256 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105225 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-log" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.105314 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-log" Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105413 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.105468 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105526 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.105577 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105643 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-httpd" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.105698 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-httpd" Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105782 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="init" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.105849 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="init" Sep 29 19:02:21 crc kubenswrapper[4780]: E0929 19:02:21.105933 4780 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106006 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106280 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-log" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106354 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api-log" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106432 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="60211468-1dd1-4611-9009-cba4f4194aad" containerName="glance-httpd" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106571 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" containerName="dnsmasq-dns" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.106636 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" containerName="barbican-api" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.108058 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.113336 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.113688 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.134468 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.237365609 podStartE2EDuration="18.134436643s" podCreationTimestamp="2025-09-29 19:02:03 +0000 UTC" firstStartedPulling="2025-09-29 19:02:04.594636577 +0000 UTC m=+1124.542934621" lastFinishedPulling="2025-09-29 19:02:19.491707611 +0000 UTC m=+1139.440005655" observedRunningTime="2025-09-29 19:02:20.955875501 +0000 UTC m=+1140.904173545" watchObservedRunningTime="2025-09-29 19:02:21.134436643 +0000 UTC m=+1141.082734687" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.155497 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.168782 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.176501 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55dc4b9644-4fsqf"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.185644 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.192081 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-548b47b48c-fwkrr"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.225791 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226131 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226212 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226255 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226283 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226334 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbtvd\" (UniqueName: \"kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.226422 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.280533 4780 scope.go:117] "RemoveContainer" containerID="2d6c0e2efc44fbbe01f7ffc27d4adeac10c4f1be0262bfbba5f0c30472f1b3c5" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.332180 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " 
pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.332841 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.332977 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.334040 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336133 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336228 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336290 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbtvd\" (UniqueName: \"kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336420 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336520 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336734 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.336958 4780 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.342931 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.343735 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.344432 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.350735 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.366569 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbtvd\" (UniqueName: \"kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.378879 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.385747 4780 scope.go:117] "RemoveContainer" containerID="4aebdec038dbac143882c5c80de1a492b75c0c952e0ef360a640daa9211caf5d" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.451717 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.623008 4780 scope.go:117] "RemoveContainer" containerID="8b720e49ca1d381276eb76f54b37e08c9dcf8df517560ea1d2da5ecf90760b1b" Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.786587 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.787255 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-log" containerID="cri-o://01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62" gracePeriod=30 Sep 29 19:02:21 crc kubenswrapper[4780]: I0929 19:02:21.787737 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-httpd" containerID="cri-o://0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101" gracePeriod=30 Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.068630 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerStarted","Data":"8cdf366e564c41077cac425fb60d05141765f8d392ad2a68245952b02c84e442"} Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.115034 4780 generic.go:334] "Generic (PLEG): container finished" podID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerID="6fa93223af11d911ba4d498a61be2d6f0fac14b0ddb7b2edb56c6124c4413c45" exitCode=0 Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.116148 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" event={"ID":"b0e696ba-be42-4d7e-8208-d5ff25c7b61c","Type":"ContainerDied","Data":"6fa93223af11d911ba4d498a61be2d6f0fac14b0ddb7b2edb56c6124c4413c45"} Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.392404 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:02:22 crc kubenswrapper[4780]: W0929 19:02:22.408539 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f300da_65dd_4c6e_ae4a_63b797768651.slice/crio-b393fe7fa56031a5a162265613befde6b421273597cc8db97ff1d05b2ec32411 WatchSource:0}: Error finding container b393fe7fa56031a5a162265613befde6b421273597cc8db97ff1d05b2ec32411: Status 404 returned error can't find the container with id b393fe7fa56031a5a162265613befde6b421273597cc8db97ff1d05b2ec32411 Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.775922 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60211468-1dd1-4611-9009-cba4f4194aad" path="/var/lib/kubelet/pods/60211468-1dd1-4611-9009-cba4f4194aad/volumes" Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.777136 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c253bf8c-311c-4f8c-ba29-a7533cf02f42" path="/var/lib/kubelet/pods/c253bf8c-311c-4f8c-ba29-a7533cf02f42/volumes" Sep 29 19:02:22 crc kubenswrapper[4780]: I0929 19:02:22.777820 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd61d23a-532f-4ca8-aa16-396c1390d9fa" path="/var/lib/kubelet/pods/fd61d23a-532f-4ca8-aa16-396c1390d9fa/volumes" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.130363 4780 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerStarted","Data":"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.130862 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.130451 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-central-agent" containerID="cri-o://15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a" gracePeriod=30 Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.130953 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="proxy-httpd" containerID="cri-o://6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21" gracePeriod=30 Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.130980 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-notification-agent" containerID="cri-o://151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f" gracePeriod=30 Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.131037 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="sg-core" containerID="cri-o://7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd" gracePeriod=30 Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.134641 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" event={"ID":"b0e696ba-be42-4d7e-8208-d5ff25c7b61c","Type":"ContainerStarted","Data":"084913df48747b008d8c649ad400b9a86279655d6e345198f5534ceb7c325bcd"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.135329 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.148305 4780 generic.go:334] "Generic (PLEG): container finished" podID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerID="01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62" exitCode=143 Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.148611 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerDied","Data":"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.159343 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerStarted","Data":"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.172879 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerStarted","Data":"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.187704 4780 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerStarted","Data":"b393fe7fa56031a5a162265613befde6b421273597cc8db97ff1d05b2ec32411"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.187778 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.187797 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerStarted","Data":"2f710c5ce82ca39295b8a385c093185e1904e19720dae5eaeaae61d9187c8809"} Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.187810 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.212796 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.652136623 podStartE2EDuration="16.212764509s" podCreationTimestamp="2025-09-29 19:02:07 +0000 UTC" firstStartedPulling="2025-09-29 19:02:08.855740343 +0000 UTC m=+1128.804038387" lastFinishedPulling="2025-09-29 19:02:21.416368229 +0000 UTC m=+1141.364666273" observedRunningTime="2025-09-29 19:02:23.160932952 +0000 UTC m=+1143.109230996" watchObservedRunningTime="2025-09-29 19:02:23.212764509 +0000 UTC m=+1143.161062553" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.236413 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" podStartSLOduration=11.236389577 podStartE2EDuration="11.236389577s" podCreationTimestamp="2025-09-29 19:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:23.189817779 +0000 UTC m=+1143.138115833" watchObservedRunningTime="2025-09-29 19:02:23.236389577 +0000 UTC m=+1143.184687621" Sep 29 19:02:23 crc kubenswrapper[4780]: I0929 19:02:23.269973 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-58b5d8cc69-dbww7" podStartSLOduration=11.269950355 podStartE2EDuration="11.269950355s" podCreationTimestamp="2025-09-29 19:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:23.259737777 +0000 UTC m=+1143.208035821" watchObservedRunningTime="2025-09-29 19:02:23.269950355 +0000 UTC m=+1143.218248399" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.193306 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerStarted","Data":"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.193670 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api-log" containerID="cri-o://8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" gracePeriod=30 Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.194096 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.194199 4780 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api" containerID="cri-o://b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" gracePeriod=30 Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205721 4780 generic.go:334] "Generic (PLEG): container finished" podID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerID="6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21" exitCode=0 Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205769 4780 generic.go:334] "Generic (PLEG): container finished" podID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerID="7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd" exitCode=2 Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205777 4780 generic.go:334] "Generic (PLEG): container finished" podID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerID="15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a" exitCode=0 Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205817 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerDied","Data":"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205887 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerDied","Data":"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.205903 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerDied","Data":"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.209838 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerStarted","Data":"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.216670 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerStarted","Data":"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150"} Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.234274 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=12.234247375 podStartE2EDuration="12.234247375s" podCreationTimestamp="2025-09-29 19:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:24.217215114 +0000 UTC m=+1144.165513178" watchObservedRunningTime="2025-09-29 19:02:24.234247375 +0000 UTC m=+1144.182545419" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.248097 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.104520526 podStartE2EDuration="13.248072077s" podCreationTimestamp="2025-09-29 19:02:11 +0000 UTC" firstStartedPulling="2025-09-29 19:02:20.272818028 +0000 UTC m=+1140.221116062" lastFinishedPulling="2025-09-29 19:02:21.416369569 +0000 UTC m=+1141.364667613" 
observedRunningTime="2025-09-29 19:02:24.244544457 +0000 UTC m=+1144.192842491" watchObservedRunningTime="2025-09-29 19:02:24.248072077 +0000 UTC m=+1144.196370121" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.915076 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.941535 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.942079 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.942214 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.942438 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.943199 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.943375 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.943521 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb97x\" (UniqueName: \"kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.943595 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle\") pod \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\" (UID: \"a4d2b71b-300c-4567-9cb6-9223ceeaef37\") " Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.944254 4780 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d2b71b-300c-4567-9cb6-9223ceeaef37-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 
19:02:24.946101 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs" (OuterVolumeSpecName: "logs") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.951205 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts" (OuterVolumeSpecName: "scripts") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.952467 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:24 crc kubenswrapper[4780]: I0929 19:02:24.960343 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x" (OuterVolumeSpecName: "kube-api-access-wb97x") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "kube-api-access-wb97x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.007236 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.019149 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data" (OuterVolumeSpecName: "config-data") pod "a4d2b71b-300c-4567-9cb6-9223ceeaef37" (UID: "a4d2b71b-300c-4567-9cb6-9223ceeaef37"). InnerVolumeSpecName "config-data". 
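
In these TearDown messages, OuterVolumeSpecName is the volume's name in the pod spec ("logs", "scripts", …) while InnerVolumeSpecName is the name the plugin operates on; for the secret, projected, and empty-dir volumes here the two coincide. The UniqueName quoted throughout is the plugin name plus the pod-UID-qualified volume name. A small helper that reconstructs it to match the strings in this log — the helper name is hypothetical, and attachable or device-mountable plugins (e.g. the local PV, whose UniqueName has no pod UID) use other shapes:

```go
package main

import "fmt"

// uniqueVolumeName rebuilds the UniqueName format used in these kubelet
// messages for per-pod volumes: <plugin>/<podUID>-<volumeSpecName>.
func uniqueVolumeName(plugin, podUID, specName string) string {
	return fmt.Sprintf("%s/%s-%s", plugin, podUID, specName)
}

func main() {
	fmt.Println(uniqueVolumeName(
		"kubernetes.io/secret",
		"a4d2b71b-300c-4567-9cb6-9223ceeaef37",
		"scripts",
	))
	// kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts
}
```
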
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.045544 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.045822 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.045912 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.045991 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d2b71b-300c-4567-9cb6-9223ceeaef37-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.046077 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb97x\" (UniqueName: \"kubernetes.io/projected/a4d2b71b-300c-4567-9cb6-9223ceeaef37-kube-api-access-wb97x\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.046178 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d2b71b-300c-4567-9cb6-9223ceeaef37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.231904 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerStarted","Data":"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c"} Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236042 4780 generic.go:334] "Generic (PLEG): container finished" podID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerID="b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" exitCode=0 Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236096 4780 generic.go:334] "Generic (PLEG): container finished" podID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerID="8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" exitCode=143 Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236148 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236301 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerDied","Data":"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126"} Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236424 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerDied","Data":"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923"} Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236521 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d2b71b-300c-4567-9cb6-9223ceeaef37","Type":"ContainerDied","Data":"bdda99f2153d22ba1cb1b9a550816fb787343da1b254b6b8fb4d4bcc8ad47421"} Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.236556 4780 scope.go:117] "RemoveContainer" containerID="b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.264220 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.264192133 podStartE2EDuration="5.264192133s" podCreationTimestamp="2025-09-29 19:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:25.255100075 +0000 UTC m=+1145.203398119" watchObservedRunningTime="2025-09-29 19:02:25.264192133 +0000 UTC m=+1145.212490177" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.288165 4780 scope.go:117] "RemoveContainer" containerID="8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.294594 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.306455 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.336858 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:25 crc kubenswrapper[4780]: E0929 19:02:25.337447 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api-log" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.337467 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api-log" Sep 29 19:02:25 crc kubenswrapper[4780]: E0929 19:02:25.337507 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.337518 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.337734 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api-log" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.337758 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" containerName="cinder-api" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 
19:02:25.338999 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.343037 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.351656 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.353138 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.353338 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.356486 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.356632 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.356723 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.356863 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.357017 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.357148 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.357238 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.438436 4780 scope.go:117] "RemoveContainer" containerID="b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" 
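
The deleted cinder-api-0's containers are garbage-collected a second time here: "RemoveContainer" runs again for IDs the runtime has already deleted, and the entries that follow show CRI answering NotFound, which the kubelet logs as "DeleteContainer returned error" and then moves past, since an already-removed container is the desired end state. Idempotent deletion of this kind is the standard pattern; a sketch with a mock runtime and hypothetical names:

```go
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound: ID does not exist")

// mock runtime state: containers that still exist on the node.
var containers = map[string]bool{}

func runtimeRemove(id string) error {
	if !containers[id] {
		return errNotFound // mirrors "rpc error: code = NotFound ..."
	}
	delete(containers, id)
	return nil
}

// removeContainer treats an already-gone container as success, matching
// how the kubelet tolerates the NotFound responses logged below.
func removeContainer(id string) error {
	if err := runtimeRemove(id); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("container %.12s already gone; nothing to do\n", id)
			return nil
		}
		return err
	}
	return nil
}

func main() {
	id := "b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126"
	_ = removeContainer(id) // NotFound, treated as done
	_ = removeContainer(id) // repeat attempts stay harmless
}
```
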
Sep 29 19:02:25 crc kubenswrapper[4780]: E0929 19:02:25.439244 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126\": container with ID starting with b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126 not found: ID does not exist" containerID="b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.439297 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126"} err="failed to get container status \"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126\": rpc error: code = NotFound desc = could not find container \"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126\": container with ID starting with b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126 not found: ID does not exist" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.439327 4780 scope.go:117] "RemoveContainer" containerID="8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" Sep 29 19:02:25 crc kubenswrapper[4780]: E0929 19:02:25.439682 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923\": container with ID starting with 8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923 not found: ID does not exist" containerID="8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.439729 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923"} err="failed to get container status \"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923\": rpc error: code = NotFound desc = could not find container \"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923\": container with ID starting with 8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923 not found: ID does not exist" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.439749 4780 scope.go:117] "RemoveContainer" containerID="b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.440015 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126"} err="failed to get container status \"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126\": rpc error: code = NotFound desc = could not find container \"b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126\": container with ID starting with b6b130caaaeb3c2f33cb4544df4631326496bfc85bcf56cdb1598dbd45f95126 not found: ID does not exist" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.440063 4780 scope.go:117] "RemoveContainer" containerID="8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.443538 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923"} err="failed to get container status 
\"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923\": rpc error: code = NotFound desc = could not find container \"8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923\": container with ID starting with 8e0382881b5b9d1feff9b94c744cb6e22dd7ae98f8dca50491a9712c24b6e923 not found: ID does not exist" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.459922 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460136 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460206 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460230 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460302 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460349 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460378 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460425 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.460567 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg62t\" (UniqueName: \"kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " 
pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.462275 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.462741 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.465438 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.466079 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.466098 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.466981 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.469208 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.567221 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg62t\" (UniqueName: \"kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.567378 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.582331 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc 
kubenswrapper[4780]: I0929 19:02:25.598828 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg62t\" (UniqueName: \"kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t\") pod \"cinder-api-0\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.739647 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.866609 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.883934 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.883999 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmrsv\" (UniqueName: \"kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.884872 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.884966 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885066 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885111 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885134 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd\") pod \"b15b3eff-0997-4228-af66-1f9caecc40bc\" (UID: \"b15b3eff-0997-4228-af66-1f9caecc40bc\") " Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885316 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885633 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.885952 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.891759 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv" (OuterVolumeSpecName: "kube-api-access-gmrsv") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "kube-api-access-gmrsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.896725 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts" (OuterVolumeSpecName: "scripts") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:25 crc kubenswrapper[4780]: I0929 19:02:25.937818 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.014458 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.014506 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmrsv\" (UniqueName: \"kubernetes.io/projected/b15b3eff-0997-4228-af66-1f9caecc40bc-kube-api-access-gmrsv\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.014521 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.014551 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b15b3eff-0997-4228-af66-1f9caecc40bc-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.045900 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data" (OuterVolumeSpecName: "config-data") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.074624 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b15b3eff-0997-4228-af66-1f9caecc40bc" (UID: "b15b3eff-0997-4228-af66-1f9caecc40bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.087777 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.117901 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118000 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118203 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118304 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118333 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkbp5\" (UniqueName: \"kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118477 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118507 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.118538 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts\") pod \"a7a98719-c1af-40eb-a2e2-b711001d277c\" (UID: \"a7a98719-c1af-40eb-a2e2-b711001d277c\") " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.119002 4780 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.119017 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15b3eff-0997-4228-af66-1f9caecc40bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.119123 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.119739 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs" (OuterVolumeSpecName: "logs") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.132356 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.132738 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts" (OuterVolumeSpecName: "scripts") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.136739 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5" (OuterVolumeSpecName: "kube-api-access-mkbp5") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "kube-api-access-mkbp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.175242 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.210565 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data" (OuterVolumeSpecName: "config-data") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.214375 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7a98719-c1af-40eb-a2e2-b711001d277c" (UID: "a7a98719-c1af-40eb-a2e2-b711001d277c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.221970 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222232 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkbp5\" (UniqueName: \"kubernetes.io/projected/a7a98719-c1af-40eb-a2e2-b711001d277c-kube-api-access-mkbp5\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222328 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222395 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222451 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222605 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222667 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7a98719-c1af-40eb-a2e2-b711001d277c-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.222721 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a98719-c1af-40eb-a2e2-b711001d277c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.266271 4780 generic.go:334] "Generic (PLEG): container finished" podID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerID="151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f" exitCode=0 Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.266355 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerDied","Data":"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f"} Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.266385 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b15b3eff-0997-4228-af66-1f9caecc40bc","Type":"ContainerDied","Data":"004214a5b6b5b5995acea722bd8c6ffb32c188e008ed493e304902d6eed229a7"} Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.266403 4780 scope.go:117] "RemoveContainer" 
containerID="6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.266549 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.270491 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.293287 4780 generic.go:334] "Generic (PLEG): container finished" podID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerID="0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101" exitCode=0 Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.293435 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerDied","Data":"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101"} Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.293481 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7a98719-c1af-40eb-a2e2-b711001d277c","Type":"ContainerDied","Data":"070e9830cfb317193a6f8fb2068ee4293313036cd87383713ef169bcda15daf5"} Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.293599 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.324879 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.404455 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.433166 4780 scope.go:117] "RemoveContainer" containerID="7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.437293 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.454109 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.505369 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.513686 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514321 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-central-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514342 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-central-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514357 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514365 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" 
containerName="glance-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514384 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-log" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514391 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-log" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514407 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-notification-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514414 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-notification-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514425 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="proxy-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514433 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="proxy-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.514448 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="sg-core" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514457 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="sg-core" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514676 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="proxy-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514695 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-central-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514708 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-log" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514731 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" containerName="glance-httpd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514749 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="sg-core" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.514767 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" containerName="ceilometer-notification-agent" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.516810 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.519902 4780 scope.go:117] "RemoveContainer" containerID="151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.526467 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.526706 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.557323 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.567790 4780 scope.go:117] "RemoveContainer" containerID="15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.586861 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.602454 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.606550 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.610089 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.611015 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.612494 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635443 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635492 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635563 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635604 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635651 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635677 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635713 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635742 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635771 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635792 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635837 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635864 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xts\" (UniqueName: \"kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635893 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635917 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.635958 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plpd4\" (UniqueName: \"kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.650853 4780 scope.go:117] "RemoveContainer" containerID="6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.651474 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21\": container with ID starting with 6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21 not found: ID does not exist" containerID="6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.651531 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21"} err="failed to get container status \"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21\": rpc error: code = NotFound desc = could not find container \"6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21\": container with ID starting with 6ca5394e41df7ff1ce10f500b7555c02e6e46a34fe70e59eb069f0fa3a4ecf21 not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.651565 4780 scope.go:117] "RemoveContainer" containerID="7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.651858 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd\": container with ID starting with 7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd not found: ID does not exist" containerID="7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.651887 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd"} err="failed to get container status \"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd\": rpc error: code = NotFound desc = could not find container \"7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd\": container with ID starting with 7f6c01cd1894cab3cb777ea7fae99a193a069263f4833acf84659408a7cf11dd not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.651911 4780 scope.go:117] "RemoveContainer" containerID="151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.652134 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f\": container with ID starting with 151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f not found: ID does not exist" containerID="151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.652157 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f"} err="failed to get container status \"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f\": rpc error: code = NotFound desc = could not find container \"151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f\": container with ID starting with 151c13b2820408fc66b589fbb17e26cae3a32c33925562429eeb3a3ee4b7553f not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.652173 4780 scope.go:117] "RemoveContainer" containerID="15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.652400 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a\": container with ID starting with 15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a not found: ID does not exist" containerID="15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.652422 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a"} err="failed to get container status \"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a\": rpc error: code = NotFound desc = could not find container \"15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a\": container with ID starting with 15e4a5b4196338e3be361ac988bd8b074b228c35c1c6559c83977485fba2210a not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.652450 4780 scope.go:117] "RemoveContainer" containerID="0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.681554 4780 scope.go:117] "RemoveContainer" containerID="01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.721514 4780 scope.go:117] "RemoveContainer" containerID="0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.722091 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101\": container with ID starting with 0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101 not found: ID does not exist" containerID="0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.722145 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101"} err="failed to get container status \"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101\": rpc error: code = NotFound desc = could not find container 
\"0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101\": container with ID starting with 0bb78244148268f3bc0a2a6000af73964a09beffb84213d07e14be5394c6e101 not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.722222 4780 scope.go:117] "RemoveContainer" containerID="01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62" Sep 29 19:02:26 crc kubenswrapper[4780]: E0929 19:02:26.722880 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62\": container with ID starting with 01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62 not found: ID does not exist" containerID="01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.722913 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62"} err="failed to get container status \"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62\": rpc error: code = NotFound desc = could not find container \"01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62\": container with ID starting with 01797a29e17b4bf02af17cd2d5631ff45edc54ed3c51f75909185bd7dfca5a62 not found: ID does not exist" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.739898 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740540 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740600 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740624 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740665 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740693 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd\") pod \"ceilometer-0\" 
(UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740729 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740752 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740805 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740833 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xts\" (UniqueName: \"kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740861 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740884 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740931 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plpd4\" (UniqueName: \"kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740979 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.741005 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.742185 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.740446 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.742378 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.743557 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.743559 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.750106 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.750113 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.751084 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.752985 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.775639 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc 
kubenswrapper[4780]: I0929 19:02:26.776397 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.778123 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.786752 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xts\" (UniqueName: \"kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.786880 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plpd4\" (UniqueName: \"kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.791428 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d2b71b-300c-4567-9cb6-9223ceeaef37" path="/var/lib/kubelet/pods/a4d2b71b-300c-4567-9cb6-9223ceeaef37/volumes" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.792806 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a98719-c1af-40eb-a2e2-b711001d277c" path="/var/lib/kubelet/pods/a7a98719-c1af-40eb-a2e2-b711001d277c/volumes" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.799082 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15b3eff-0997-4228-af66-1f9caecc40bc" path="/var/lib/kubelet/pods/b15b3eff-0997-4228-af66-1f9caecc40bc/volumes" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.808429 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " pod="openstack/glance-default-external-api-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.815494 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.848162 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:26 crc kubenswrapper[4780]: I0929 19:02:26.957141 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.254682 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.321131 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerStarted","Data":"5b0cd11d894df6419f2647d47c9d408f4434a3a224381a40853e23c6715badbf"} Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.445560 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.600861 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.685445 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.695556 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.695886 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="dnsmasq-dns" containerID="cri-o://a48578dfa62158c3d36a1c8e328dc6d6a0dd83fa657462cb689de3bb8eb360c5" gracePeriod=10 Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.733451 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.825217 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.841748 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Sep 29 19:02:27 crc kubenswrapper[4780]: I0929 19:02:27.965570 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.273278 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.312993 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.346319 4780 generic.go:334] "Generic (PLEG): container finished" podID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerID="a48578dfa62158c3d36a1c8e328dc6d6a0dd83fa657462cb689de3bb8eb360c5" exitCode=0 Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.346420 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" event={"ID":"01004ec9-c3e3-4549-abbf-94af0692c0b1","Type":"ContainerDied","Data":"a48578dfa62158c3d36a1c8e328dc6d6a0dd83fa657462cb689de3bb8eb360c5"} Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.349572 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerStarted","Data":"aae1a7e720cb23ff6cab4d895d4d7d7fe47acc5b243d3c4f6eaa4b6fe46a9e00"} Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.350876 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerStarted","Data":"3bda6a3990989c5de963820a7909c40b3ae3cd9664dcb9470952746431a50e29"} Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.353182 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerStarted","Data":"dfe968157267fe3b8b2e7d077102925efda37cf4ef744e99b8e1bdfe9da91a6e"} Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.353364 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="cinder-scheduler" containerID="cri-o://ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" gracePeriod=30 Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.353479 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="probe" containerID="cri-o://f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" gracePeriod=30 Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.773385 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.919412 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.920220 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.920306 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drdw9\" (UniqueName: \"kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.920340 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.920623 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.920663 4780 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config\") pod \"01004ec9-c3e3-4549-abbf-94af0692c0b1\" (UID: \"01004ec9-c3e3-4549-abbf-94af0692c0b1\") " Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.932099 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9" (OuterVolumeSpecName: "kube-api-access-drdw9") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "kube-api-access-drdw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:28 crc kubenswrapper[4780]: I0929 19:02:28.992565 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config" (OuterVolumeSpecName: "config") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.023537 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drdw9\" (UniqueName: \"kubernetes.io/projected/01004ec9-c3e3-4549-abbf-94af0692c0b1-kube-api-access-drdw9\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.023569 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.029646 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.050661 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.058894 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.114880 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01004ec9-c3e3-4549-abbf-94af0692c0b1" (UID: "01004ec9-c3e3-4549-abbf-94af0692c0b1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.127815 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.127859 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.127869 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.127885 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01004ec9-c3e3-4549-abbf-94af0692c0b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.364174 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerStarted","Data":"22c83df1dfa900462fb0bdf93010df94b1f1fbd660599f3ce6a52119f57afbe9"} Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.409760 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" event={"ID":"01004ec9-c3e3-4549-abbf-94af0692c0b1","Type":"ContainerDied","Data":"825316ce0117aa509bed175269eecfb920d7ed775271bf5915fb6158a44764ef"} Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.409850 4780 scope.go:117] "RemoveContainer" containerID="a48578dfa62158c3d36a1c8e328dc6d6a0dd83fa657462cb689de3bb8eb360c5" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.410144 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5674f66f87-vrjks" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.431334 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerStarted","Data":"391cc111e8cd575fa81674aac39e64f1e0c3b2f3fc46853f4758411b706b35aa"} Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.431530 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.451161 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerStarted","Data":"82ea9e8eb2a77addfa0b90f1cb3512739bca3354479a9e8624fd56c431aa57a5"} Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.465858 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.465815005 podStartE2EDuration="4.465815005s" podCreationTimestamp="2025-09-29 19:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:29.46281571 +0000 UTC m=+1149.411113764" watchObservedRunningTime="2025-09-29 19:02:29.465815005 +0000 UTC m=+1149.414113179" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.577192 4780 scope.go:117] "RemoveContainer" containerID="3b45ce17ef2243b071119c687f635bd1e57c2a3657b1ca7ab22004546aaa3940" Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.598732 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:02:29 crc kubenswrapper[4780]: I0929 19:02:29.618556 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5674f66f87-vrjks"] Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.458507 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.479584 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerStarted","Data":"03ae06f78dc0c07ea23a8c0d99ca4816b7483d6e05af27e8405d42df5a2dcd7d"} Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.485844 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerStarted","Data":"eeee69b0a809e51c2de8aae84184f344369f4e4f6fab7ebfa4f65f602565ed13"} Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.499583 4780 generic.go:334] "Generic (PLEG): container finished" podID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerID="f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" exitCode=0 Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.499624 4780 generic.go:334] "Generic (PLEG): container finished" podID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerID="ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" exitCode=0 Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.500535 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.500697 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerDied","Data":"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6"} Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.500725 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerDied","Data":"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5"} Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.500736 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2","Type":"ContainerDied","Data":"2960a2c291d26fb46da0a2ce001ef0586854f9e68db645bfc03a989166650237"} Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.500753 4780 scope.go:117] "RemoveContainer" containerID="f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.535885 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.535853435 podStartE2EDuration="4.535853435s" podCreationTimestamp="2025-09-29 19:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:30.527467978 +0000 UTC m=+1150.475766022" watchObservedRunningTime="2025-09-29 19:02:30.535853435 +0000 UTC m=+1150.484151469" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.588174 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.588265 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8j6\" (UniqueName: \"kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.588335 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.588418 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.588472 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 
19:02:30.588548 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts\") pod \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\" (UID: \"c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2\") " Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.592397 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.597501 4780 scope.go:117] "RemoveContainer" containerID="ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.601246 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts" (OuterVolumeSpecName: "scripts") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.601395 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6" (OuterVolumeSpecName: "kube-api-access-zq8j6") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "kube-api-access-zq8j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.601717 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.636132 4780 scope.go:117] "RemoveContainer" containerID="f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.636832 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6\": container with ID starting with f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6 not found: ID does not exist" containerID="f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.636892 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6"} err="failed to get container status \"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6\": rpc error: code = NotFound desc = could not find container \"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6\": container with ID starting with f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6 not found: ID does not exist" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.636931 4780 scope.go:117] "RemoveContainer" containerID="ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.652351 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5\": container with ID starting with ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5 not found: ID does not exist" containerID="ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.652416 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5"} err="failed to get container status \"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5\": rpc error: code = NotFound desc = could not find container \"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5\": container with ID starting with ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5 not found: ID does not exist" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.652458 4780 scope.go:117] "RemoveContainer" containerID="f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.652814 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6"} err="failed to get container status \"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6\": rpc error: code = NotFound desc = could not find container \"f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6\": container with ID starting with f4f315ca796827fda71573ad1de921b33c074961cd8dba45e4e2997a5a8589d6 not found: ID does not exist" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.652841 4780 scope.go:117] "RemoveContainer" containerID="ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.653238 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5"} err="failed to get container status \"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5\": rpc error: code = NotFound desc = could not find container \"ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5\": container with ID starting with ec038b2ee77ecfb3c0d0ba6f2046db48c80798521b6f55ed6d89d431773261d5 not found: ID does not exist" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.677819 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.693947 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.693995 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.694014 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.694029 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8j6\" (UniqueName: \"kubernetes.io/projected/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-kube-api-access-zq8j6\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.694059 4780 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.746280 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data" (OuterVolumeSpecName: "config-data") pod "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" (UID: "c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.779804 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" path="/var/lib/kubelet/pods/01004ec9-c3e3-4549-abbf-94af0692c0b1/volumes" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.787225 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.797257 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.865883 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.874995 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.890009 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.891780 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="dnsmasq-dns" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.891812 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="dnsmasq-dns" Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.891878 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="probe" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.891895 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="probe" Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.891920 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="cinder-scheduler" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.891930 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="cinder-scheduler" Sep 29 19:02:30 crc kubenswrapper[4780]: E0929 19:02:30.891989 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="init" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.892002 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="init" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.902701 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="probe" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.902803 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" containerName="cinder-scheduler" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.902856 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="01004ec9-c3e3-4549-abbf-94af0692c0b1" containerName="dnsmasq-dns" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.911567 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.922883 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 29 19:02:30 crc kubenswrapper[4780]: I0929 19:02:30.936749 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019287 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019377 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019420 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019458 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqknp\" (UniqueName: \"kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019516 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.019543 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.122469 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqknp\" (UniqueName: \"kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.122960 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.123078 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.123243 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.123374 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.123501 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.124222 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.131208 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.133726 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.134951 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.135104 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.153773 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqknp\" (UniqueName: \"kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp\") pod \"cinder-scheduler-0\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " pod="openstack/cinder-scheduler-0" Sep 29 
19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.255124 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.451983 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.452026 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.523783 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.525851 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.536012 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerStarted","Data":"32a28a013c0707d5bffc69991008150bc58a3dd41aed0c1be6e11a4c7eab4240"} Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.537176 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.537238 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:31 crc kubenswrapper[4780]: I0929 19:02:31.792475 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.478917 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.565369 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerStarted","Data":"2bed75a72ebe1e8129dcb4991c90de2501dc80eaaa5a5d00e170c5bcd8aefd4f"} Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.565430 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerStarted","Data":"aa91dbe2bcb995257ece6a558d9d8700ad7320f1ed8d1df9fc1b87bfe9c9a0f4"} Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.582023 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.582714 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-686fd87d4d-xmdcq" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-api" containerID="cri-o://8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.583828 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-686fd87d4d-xmdcq" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-httpd" containerID="cri-o://ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.631521 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerStarted","Data":"80736a7ad3ff0e9af366b896294e7934940a595cef7b82174100610188e8924b"} Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.631578 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-central-agent" containerID="cri-o://82ea9e8eb2a77addfa0b90f1cb3512739bca3354479a9e8624fd56c431aa57a5" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.631756 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="proxy-httpd" containerID="cri-o://80736a7ad3ff0e9af366b896294e7934940a595cef7b82174100610188e8924b" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.631826 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="sg-core" containerID="cri-o://32a28a013c0707d5bffc69991008150bc58a3dd41aed0c1be6e11a4c7eab4240" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.631882 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-notification-agent" containerID="cri-o://03ae06f78dc0c07ea23a8c0d99ca4816b7483d6e05af27e8405d42df5a2dcd7d" gracePeriod=30 Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.632042 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.681569 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.217604464 podStartE2EDuration="6.681536987s" podCreationTimestamp="2025-09-29 19:02:26 +0000 UTC" firstStartedPulling="2025-09-29 19:02:27.47164136 +0000 UTC m=+1147.419939404" lastFinishedPulling="2025-09-29 19:02:31.935573883 +0000 UTC m=+1151.883871927" observedRunningTime="2025-09-29 19:02:32.664872035 +0000 UTC m=+1152.613170079" watchObservedRunningTime="2025-09-29 19:02:32.681536987 +0000 UTC m=+1152.629835031" Sep 29 19:02:32 crc kubenswrapper[4780]: I0929 19:02:32.795610 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2" path="/var/lib/kubelet/pods/c8d2d8b1-d162-457f-b7f1-bfd0ee29b6b2/volumes" Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.700499 4780 generic.go:334] "Generic (PLEG): container finished" podID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerID="ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a" exitCode=0 Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.700586 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerDied","Data":"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a"} Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.706707 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerStarted","Data":"dbcf9928788092ab977ae712ae4612aa67a6f7d49fb7301d908346b1aca4b563"} Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715226 4780 generic.go:334] "Generic (PLEG): container 
finished" podID="92561738-6b0f-4afb-86cf-c4b129880383" containerID="80736a7ad3ff0e9af366b896294e7934940a595cef7b82174100610188e8924b" exitCode=0 Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715272 4780 generic.go:334] "Generic (PLEG): container finished" podID="92561738-6b0f-4afb-86cf-c4b129880383" containerID="32a28a013c0707d5bffc69991008150bc58a3dd41aed0c1be6e11a4c7eab4240" exitCode=2 Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715281 4780 generic.go:334] "Generic (PLEG): container finished" podID="92561738-6b0f-4afb-86cf-c4b129880383" containerID="03ae06f78dc0c07ea23a8c0d99ca4816b7483d6e05af27e8405d42df5a2dcd7d" exitCode=0 Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715308 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerDied","Data":"80736a7ad3ff0e9af366b896294e7934940a595cef7b82174100610188e8924b"} Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715338 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerDied","Data":"32a28a013c0707d5bffc69991008150bc58a3dd41aed0c1be6e11a4c7eab4240"} Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.715350 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerDied","Data":"03ae06f78dc0c07ea23a8c0d99ca4816b7483d6e05af27e8405d42df5a2dcd7d"} Sep 29 19:02:33 crc kubenswrapper[4780]: I0929 19:02:33.743339 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.743308524 podStartE2EDuration="3.743308524s" podCreationTimestamp="2025-09-29 19:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:33.734987139 +0000 UTC m=+1153.683285203" watchObservedRunningTime="2025-09-29 19:02:33.743308524 +0000 UTC m=+1153.691606568" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.018688 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2bngj"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.020706 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.033634 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2wx\" (UniqueName: \"kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx\") pod \"nova-api-db-create-2bngj\" (UID: \"1b25d81f-c401-483c-b772-d2570b578c8c\") " pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.037028 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2bngj"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.135803 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2wx\" (UniqueName: \"kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx\") pod \"nova-api-db-create-2bngj\" (UID: \"1b25d81f-c401-483c-b772-d2570b578c8c\") " pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.234800 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2wx\" (UniqueName: \"kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx\") pod \"nova-api-db-create-2bngj\" (UID: \"1b25d81f-c401-483c-b772-d2570b578c8c\") " pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.286143 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dc4k2"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.289160 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.328283 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dc4k2"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.360379 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-2dxq9"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.362073 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.362902 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.390683 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2dxq9"] Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.477320 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzt7h\" (UniqueName: \"kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h\") pod \"nova-cell1-db-create-2dxq9\" (UID: \"8f7fccbf-70e9-4d7c-9915-97026e49e6b0\") " pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.477439 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnmh\" (UniqueName: \"kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh\") pod \"nova-cell0-db-create-dc4k2\" (UID: \"2ead435a-236c-441f-bb69-6a3f2d5c88e3\") " pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.579631 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzt7h\" (UniqueName: \"kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h\") pod \"nova-cell1-db-create-2dxq9\" (UID: \"8f7fccbf-70e9-4d7c-9915-97026e49e6b0\") " pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.579788 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnmh\" (UniqueName: \"kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh\") pod \"nova-cell0-db-create-dc4k2\" (UID: \"2ead435a-236c-441f-bb69-6a3f2d5c88e3\") " pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.613076 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzt7h\" (UniqueName: \"kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h\") pod \"nova-cell1-db-create-2dxq9\" (UID: \"8f7fccbf-70e9-4d7c-9915-97026e49e6b0\") " pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.639146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fnmh\" (UniqueName: \"kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh\") pod \"nova-cell0-db-create-dc4k2\" (UID: \"2ead435a-236c-441f-bb69-6a3f2d5c88e3\") " pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.712813 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:34 crc kubenswrapper[4780]: I0929 19:02:34.937781 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.023810 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2bngj"] Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.038340 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.038502 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.208100 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.358686 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2dxq9"] Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.631346 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dc4k2"] Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.765879 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2dxq9" event={"ID":"8f7fccbf-70e9-4d7c-9915-97026e49e6b0","Type":"ContainerStarted","Data":"ce83abddb679064c04606622289ed5fd5649a02935122d4ae1834743018dd57e"} Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.767368 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2dxq9" event={"ID":"8f7fccbf-70e9-4d7c-9915-97026e49e6b0","Type":"ContainerStarted","Data":"273e020d36673d432a4ea00d0c09a6eec39c7cba5c8ece366da50037d6cea710"} Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.773708 4780 generic.go:334] "Generic (PLEG): container finished" podID="1b25d81f-c401-483c-b772-d2570b578c8c" containerID="3a9cbda79fe816a7955ce7cbcd0a4685df1c531b028a0f5b21c8ed5f82a66202" exitCode=0 Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.774032 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2bngj" event={"ID":"1b25d81f-c401-483c-b772-d2570b578c8c","Type":"ContainerDied","Data":"3a9cbda79fe816a7955ce7cbcd0a4685df1c531b028a0f5b21c8ed5f82a66202"} Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.774288 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2bngj" event={"ID":"1b25d81f-c401-483c-b772-d2570b578c8c","Type":"ContainerStarted","Data":"b69bcc6d1d0f04fcfb493a0d9e31acb38cca1e828efce280df3de906c7c6f8ea"} Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.781706 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dc4k2" event={"ID":"2ead435a-236c-441f-bb69-6a3f2d5c88e3","Type":"ContainerStarted","Data":"2aa1c64218843dedfd82d252329816d4d399305259a3aca04b2632f1f2082748"} Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.794542 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-2dxq9" podStartSLOduration=1.794514662 podStartE2EDuration="1.794514662s" podCreationTimestamp="2025-09-29 19:02:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:02:35.787686788 +0000 UTC m=+1155.735984832" watchObservedRunningTime="2025-09-29 19:02:35.794514662 +0000 UTC m=+1155.742812706" Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.794692 4780 generic.go:334] "Generic (PLEG): container 
finished" podID="92561738-6b0f-4afb-86cf-c4b129880383" containerID="82ea9e8eb2a77addfa0b90f1cb3512739bca3354479a9e8624fd56c431aa57a5" exitCode=0 Sep 29 19:02:35 crc kubenswrapper[4780]: I0929 19:02:35.798112 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerDied","Data":"82ea9e8eb2a77addfa0b90f1cb3512739bca3354479a9e8624fd56c431aa57a5"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.052551 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.153931 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plpd4\" (UniqueName: \"kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.154782 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.155310 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.155345 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.155397 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.155435 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.155531 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle\") pod \"92561738-6b0f-4afb-86cf-c4b129880383\" (UID: \"92561738-6b0f-4afb-86cf-c4b129880383\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.158786 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.159159 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.160961 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.161002 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92561738-6b0f-4afb-86cf-c4b129880383-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.170325 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4" (OuterVolumeSpecName: "kube-api-access-plpd4") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "kube-api-access-plpd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.171177 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts" (OuterVolumeSpecName: "scripts") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.189268 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.257287 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.275188 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.275234 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plpd4\" (UniqueName: \"kubernetes.io/projected/92561738-6b0f-4afb-86cf-c4b129880383-kube-api-access-plpd4\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.275249 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.301588 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.379380 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.396107 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data" (OuterVolumeSpecName: "config-data") pod "92561738-6b0f-4afb-86cf-c4b129880383" (UID: "92561738-6b0f-4afb-86cf-c4b129880383"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.481900 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92561738-6b0f-4afb-86cf-c4b129880383-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.804524 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.807037 4780 generic.go:334] "Generic (PLEG): container finished" podID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerID="8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195" exitCode=0 Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.807101 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerDied","Data":"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.807162 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-686fd87d4d-xmdcq" event={"ID":"392fcdb5-646c-4fd3-b2cf-65ced169dfcf","Type":"ContainerDied","Data":"9a76c11b9e730bd001245d655e4f0f6cf000be8580e41920c4206be241127d0b"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.807184 4780 scope.go:117] "RemoveContainer" containerID="ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.815506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92561738-6b0f-4afb-86cf-c4b129880383","Type":"ContainerDied","Data":"3bda6a3990989c5de963820a7909c40b3ae3cd9664dcb9470952746431a50e29"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.815636 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.819912 4780 generic.go:334] "Generic (PLEG): container finished" podID="8f7fccbf-70e9-4d7c-9915-97026e49e6b0" containerID="ce83abddb679064c04606622289ed5fd5649a02935122d4ae1834743018dd57e" exitCode=0 Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.819994 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2dxq9" event={"ID":"8f7fccbf-70e9-4d7c-9915-97026e49e6b0","Type":"ContainerDied","Data":"ce83abddb679064c04606622289ed5fd5649a02935122d4ae1834743018dd57e"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.859923 4780 generic.go:334] "Generic (PLEG): container finished" podID="2ead435a-236c-441f-bb69-6a3f2d5c88e3" containerID="97e685ab1f93a82fa5ae729aab612fc064f95fcdbd21f62041a2a74c7c2ea186" exitCode=0 Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.860622 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dc4k2" event={"ID":"2ead435a-236c-441f-bb69-6a3f2d5c88e3","Type":"ContainerDied","Data":"97e685ab1f93a82fa5ae729aab612fc064f95fcdbd21f62041a2a74c7c2ea186"} Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.869349 4780 scope.go:117] "RemoveContainer" containerID="8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.963252 4780 scope.go:117] "RemoveContainer" containerID="ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.963305 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.964452 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.968858 4780 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a\": container with ID starting with ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a not found: ID does not exist" containerID="ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.968957 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a"} err="failed to get container status \"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a\": rpc error: code = NotFound desc = could not find container \"ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a\": container with ID starting with ad58541644de12ae7c22e959f7fde7078bbc277f8ed9bfd83b2e60beec2bb51a not found: ID does not exist" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.969005 4780 scope.go:117] "RemoveContainer" containerID="8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.969171 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.969957 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195\": container with ID starting with 8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195 not found: ID does not exist" containerID="8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.969998 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195"} err="failed to get container status \"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195\": rpc error: code = NotFound desc = could not find container \"8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195\": container with ID starting with 8b1a8ff981560373a78c1ae583a79ea51b16c5c8809e80ce32670fd2a23c1195 not found: ID does not exist" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.970014 4780 scope.go:117] "RemoveContainer" containerID="80736a7ad3ff0e9af366b896294e7934940a595cef7b82174100610188e8924b" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.975879 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986202 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986828 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="proxy-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986856 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="proxy-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986881 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="sg-core" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986890 4780 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="sg-core" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986902 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986909 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986937 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-notification-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986943 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-notification-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986959 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-central-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986967 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-central-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: E0929 19:02:36.986978 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-api" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.986985 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-api" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987243 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="proxy-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987264 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-central-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987276 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-api" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987294 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="ceilometer-notification-agent" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987307 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" containerName="neutron-httpd" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.987319 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="92561738-6b0f-4afb-86cf-c4b129880383" containerName="sg-core" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.989933 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.994409 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.994613 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.994899 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs\") pod \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.994980 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config\") pod \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.995138 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config\") pod \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.995164 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle\") pod \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " Sep 29 19:02:36 crc kubenswrapper[4780]: I0929 19:02:36.995214 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjtdg\" (UniqueName: \"kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg\") pod \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\" (UID: \"392fcdb5-646c-4fd3-b2cf-65ced169dfcf\") " Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.008209 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg" (OuterVolumeSpecName: "kube-api-access-rjtdg") pod "392fcdb5-646c-4fd3-b2cf-65ced169dfcf" (UID: "392fcdb5-646c-4fd3-b2cf-65ced169dfcf"). InnerVolumeSpecName "kube-api-access-rjtdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.012280 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.013437 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "392fcdb5-646c-4fd3-b2cf-65ced169dfcf" (UID: "392fcdb5-646c-4fd3-b2cf-65ced169dfcf"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.078576 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.104032 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.104165 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392fcdb5-646c-4fd3-b2cf-65ced169dfcf" (UID: "392fcdb5-646c-4fd3-b2cf-65ced169dfcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131360 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131455 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131483 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131712 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131758 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.131847 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.132026 4780 
reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.132059 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.132071 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjtdg\" (UniqueName: \"kubernetes.io/projected/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-kube-api-access-rjtdg\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.151222 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config" (OuterVolumeSpecName: "config") pod "392fcdb5-646c-4fd3-b2cf-65ced169dfcf" (UID: "392fcdb5-646c-4fd3-b2cf-65ced169dfcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.169256 4780 scope.go:117] "RemoveContainer" containerID="32a28a013c0707d5bffc69991008150bc58a3dd41aed0c1be6e11a4c7eab4240" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.208968 4780 scope.go:117] "RemoveContainer" containerID="03ae06f78dc0c07ea23a8c0d99ca4816b7483d6e05af27e8405d42df5a2dcd7d" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233739 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233864 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233912 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233940 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.233994 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.234018 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.234134 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.234516 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.237437 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.245930 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.266104 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.266230 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "392fcdb5-646c-4fd3-b2cf-65ced169dfcf" (UID: "392fcdb5-646c-4fd3-b2cf-65ced169dfcf"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.267226 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.267262 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.267564 4780 scope.go:117] "RemoveContainer" containerID="82ea9e8eb2a77addfa0b90f1cb3512739bca3354479a9e8624fd56c431aa57a5" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.282023 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf\") pod \"ceilometer-0\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.337778 4780 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/392fcdb5-646c-4fd3-b2cf-65ced169dfcf-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.376536 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.439522 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk2wx\" (UniqueName: \"kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx\") pod \"1b25d81f-c401-483c-b772-d2570b578c8c\" (UID: \"1b25d81f-c401-483c-b772-d2570b578c8c\") " Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.444372 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx" (OuterVolumeSpecName: "kube-api-access-gk2wx") pod "1b25d81f-c401-483c-b772-d2570b578c8c" (UID: "1b25d81f-c401-483c-b772-d2570b578c8c"). InnerVolumeSpecName "kube-api-access-gk2wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.446031 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.541903 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk2wx\" (UniqueName: \"kubernetes.io/projected/1b25d81f-c401-483c-b772-d2570b578c8c-kube-api-access-gk2wx\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.870740 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2bngj" event={"ID":"1b25d81f-c401-483c-b772-d2570b578c8c","Type":"ContainerDied","Data":"b69bcc6d1d0f04fcfb493a0d9e31acb38cca1e828efce280df3de906c7c6f8ea"} Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.871806 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69bcc6d1d0f04fcfb493a0d9e31acb38cca1e828efce280df3de906c7c6f8ea" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.870763 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2bngj" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.881661 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-686fd87d4d-xmdcq" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.882470 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.882515 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.947137 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:02:37 crc kubenswrapper[4780]: I0929 19:02:37.952226 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-686fd87d4d-xmdcq"] Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.018514 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.337212 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.349964 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.394013 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fnmh\" (UniqueName: \"kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh\") pod \"2ead435a-236c-441f-bb69-6a3f2d5c88e3\" (UID: \"2ead435a-236c-441f-bb69-6a3f2d5c88e3\") " Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.394164 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzt7h\" (UniqueName: \"kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h\") pod \"8f7fccbf-70e9-4d7c-9915-97026e49e6b0\" (UID: \"8f7fccbf-70e9-4d7c-9915-97026e49e6b0\") " Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.407377 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh" (OuterVolumeSpecName: "kube-api-access-5fnmh") pod "2ead435a-236c-441f-bb69-6a3f2d5c88e3" (UID: "2ead435a-236c-441f-bb69-6a3f2d5c88e3"). 
InnerVolumeSpecName "kube-api-access-5fnmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.410921 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h" (OuterVolumeSpecName: "kube-api-access-mzt7h") pod "8f7fccbf-70e9-4d7c-9915-97026e49e6b0" (UID: "8f7fccbf-70e9-4d7c-9915-97026e49e6b0"). InnerVolumeSpecName "kube-api-access-mzt7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.497178 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fnmh\" (UniqueName: \"kubernetes.io/projected/2ead435a-236c-441f-bb69-6a3f2d5c88e3-kube-api-access-5fnmh\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.497541 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzt7h\" (UniqueName: \"kubernetes.io/projected/8f7fccbf-70e9-4d7c-9915-97026e49e6b0-kube-api-access-mzt7h\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.714817 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.776332 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fcdb5-646c-4fd3-b2cf-65ced169dfcf" path="/var/lib/kubelet/pods/392fcdb5-646c-4fd3-b2cf-65ced169dfcf/volumes" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.779410 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92561738-6b0f-4afb-86cf-c4b129880383" path="/var/lib/kubelet/pods/92561738-6b0f-4afb-86cf-c4b129880383/volumes" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.905341 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2dxq9" event={"ID":"8f7fccbf-70e9-4d7c-9915-97026e49e6b0","Type":"ContainerDied","Data":"273e020d36673d432a4ea00d0c09a6eec39c7cba5c8ece366da50037d6cea710"} Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.905389 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273e020d36673d432a4ea00d0c09a6eec39c7cba5c8ece366da50037d6cea710" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.905456 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2dxq9" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.912328 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dc4k2" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.912545 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dc4k2" event={"ID":"2ead435a-236c-441f-bb69-6a3f2d5c88e3","Type":"ContainerDied","Data":"2aa1c64218843dedfd82d252329816d4d399305259a3aca04b2632f1f2082748"} Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.912691 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa1c64218843dedfd82d252329816d4d399305259a3aca04b2632f1f2082748" Sep 29 19:02:38 crc kubenswrapper[4780]: I0929 19:02:38.924179 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerStarted","Data":"ea35d159961d706e6d59d3ea85a024f1865edf4a9c9f40b45836ac38fbb58997"} Sep 29 19:02:39 crc kubenswrapper[4780]: I0929 19:02:39.115369 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 29 19:02:39 crc kubenswrapper[4780]: I0929 19:02:39.937242 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerStarted","Data":"7563f62c7d192873664b5539633ac2fde9dc733fcd4ff8fc292dbe541058c7c4"} Sep 29 19:02:39 crc kubenswrapper[4780]: I0929 19:02:39.937331 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 19:02:39 crc kubenswrapper[4780]: I0929 19:02:39.938447 4780 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 29 19:02:40 crc kubenswrapper[4780]: I0929 19:02:40.664351 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 19:02:40 crc kubenswrapper[4780]: I0929 19:02:40.666822 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 29 19:02:40 crc kubenswrapper[4780]: I0929 19:02:40.953420 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerStarted","Data":"db1e0aac2dd316763743b8073f13f5f3d2945134648a553ed038384fa582568b"} Sep 29 19:02:40 crc kubenswrapper[4780]: I0929 19:02:40.953482 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerStarted","Data":"81e3d73027f91ab31aac220a7de46ac4a8630097a4cc4037feb1c64439b2bfce"} Sep 29 19:02:41 crc kubenswrapper[4780]: I0929 19:02:41.572340 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.976762 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerStarted","Data":"91ce58230a4d4ba1acda2dffdc220234bf7d53441c137c7d750d474ad2d52511"} Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.977226 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.977039 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-notification-agent" 
containerID="cri-o://81e3d73027f91ab31aac220a7de46ac4a8630097a4cc4037feb1c64439b2bfce" gracePeriod=30 Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.976928 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-central-agent" containerID="cri-o://7563f62c7d192873664b5539633ac2fde9dc733fcd4ff8fc292dbe541058c7c4" gracePeriod=30 Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.977105 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="proxy-httpd" containerID="cri-o://91ce58230a4d4ba1acda2dffdc220234bf7d53441c137c7d750d474ad2d52511" gracePeriod=30 Sep 29 19:02:42 crc kubenswrapper[4780]: I0929 19:02:42.977113 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="sg-core" containerID="cri-o://db1e0aac2dd316763743b8073f13f5f3d2945134648a553ed038384fa582568b" gracePeriod=30 Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.006426 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.986413759 podStartE2EDuration="7.006406873s" podCreationTimestamp="2025-09-29 19:02:36 +0000 UTC" firstStartedPulling="2025-09-29 19:02:38.112662291 +0000 UTC m=+1158.060960335" lastFinishedPulling="2025-09-29 19:02:42.132655405 +0000 UTC m=+1162.080953449" observedRunningTime="2025-09-29 19:02:43.003990645 +0000 UTC m=+1162.952288709" watchObservedRunningTime="2025-09-29 19:02:43.006406873 +0000 UTC m=+1162.954704917" Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.986930 4780 generic.go:334] "Generic (PLEG): container finished" podID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerID="91ce58230a4d4ba1acda2dffdc220234bf7d53441c137c7d750d474ad2d52511" exitCode=0 Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987326 4780 generic.go:334] "Generic (PLEG): container finished" podID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerID="db1e0aac2dd316763743b8073f13f5f3d2945134648a553ed038384fa582568b" exitCode=2 Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987335 4780 generic.go:334] "Generic (PLEG): container finished" podID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerID="81e3d73027f91ab31aac220a7de46ac4a8630097a4cc4037feb1c64439b2bfce" exitCode=0 Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987343 4780 generic.go:334] "Generic (PLEG): container finished" podID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerID="7563f62c7d192873664b5539633ac2fde9dc733fcd4ff8fc292dbe541058c7c4" exitCode=0 Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987365 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerDied","Data":"91ce58230a4d4ba1acda2dffdc220234bf7d53441c137c7d750d474ad2d52511"} Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987393 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerDied","Data":"db1e0aac2dd316763743b8073f13f5f3d2945134648a553ed038384fa582568b"} Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987404 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerDied","Data":"81e3d73027f91ab31aac220a7de46ac4a8630097a4cc4037feb1c64439b2bfce"} Sep 29 19:02:43 crc kubenswrapper[4780]: I0929 19:02:43.987413 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerDied","Data":"7563f62c7d192873664b5539633ac2fde9dc733fcd4ff8fc292dbe541058c7c4"} Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.150143 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0c9e-account-create-z6hgb"] Sep 29 19:02:44 crc kubenswrapper[4780]: E0929 19:02:44.150703 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead435a-236c-441f-bb69-6a3f2d5c88e3" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.150731 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead435a-236c-441f-bb69-6a3f2d5c88e3" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: E0929 19:02:44.150746 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7fccbf-70e9-4d7c-9915-97026e49e6b0" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.150755 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7fccbf-70e9-4d7c-9915-97026e49e6b0" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: E0929 19:02:44.150797 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b25d81f-c401-483c-b772-d2570b578c8c" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.150806 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b25d81f-c401-483c-b772-d2570b578c8c" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.151040 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead435a-236c-441f-bb69-6a3f2d5c88e3" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.151085 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7fccbf-70e9-4d7c-9915-97026e49e6b0" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.151110 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b25d81f-c401-483c-b772-d2570b578c8c" containerName="mariadb-database-create" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.151837 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.159522 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.179679 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0c9e-account-create-z6hgb"] Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.346574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frb2\" (UniqueName: \"kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2\") pod \"nova-api-0c9e-account-create-z6hgb\" (UID: \"fda6e087-f771-4d4c-870a-e6a1c9d1c98c\") " pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.356665 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-48be-account-create-sjthn"] Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.358643 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.362641 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.370638 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-48be-account-create-sjthn"] Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.449468 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frb2\" (UniqueName: \"kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2\") pod \"nova-api-0c9e-account-create-z6hgb\" (UID: \"fda6e087-f771-4d4c-870a-e6a1c9d1c98c\") " pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.458561 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.477219 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7frb2\" (UniqueName: \"kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2\") pod \"nova-api-0c9e-account-create-z6hgb\" (UID: \"fda6e087-f771-4d4c-870a-e6a1c9d1c98c\") " pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.501719 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.552535 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzdj\" (UniqueName: \"kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj\") pod \"nova-cell1-48be-account-create-sjthn\" (UID: \"d1848d52-c01b-4618-bbf7-777cc63f0544\") " pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.655337 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.655873 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.655928 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.655962 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.656036 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.656126 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.656269 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml\") pod \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\" (UID: \"b50e556e-2c88-4c9b-9493-4e3698b77aa4\") " Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.657483 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzdj\" (UniqueName: \"kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj\") pod \"nova-cell1-48be-account-create-sjthn\" (UID: \"d1848d52-c01b-4618-bbf7-777cc63f0544\") " pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.663623 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.663845 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.663940 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf" (OuterVolumeSpecName: "kube-api-access-wdjnf") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "kube-api-access-wdjnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.667355 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts" (OuterVolumeSpecName: "scripts") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.679574 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzdj\" (UniqueName: \"kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj\") pod \"nova-cell1-48be-account-create-sjthn\" (UID: \"d1848d52-c01b-4618-bbf7-777cc63f0544\") " pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.708349 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.759196 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.759227 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdjnf\" (UniqueName: \"kubernetes.io/projected/b50e556e-2c88-4c9b-9493-4e3698b77aa4-kube-api-access-wdjnf\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.759240 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.759250 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.759259 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b50e556e-2c88-4c9b-9493-4e3698b77aa4-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.769935 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.800997 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data" (OuterVolumeSpecName: "config-data") pod "b50e556e-2c88-4c9b-9493-4e3698b77aa4" (UID: "b50e556e-2c88-4c9b-9493-4e3698b77aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.814002 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.862026 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.862081 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b50e556e-2c88-4c9b-9493-4e3698b77aa4-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:44 crc kubenswrapper[4780]: I0929 19:02:44.970898 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0c9e-account-create-z6hgb"] Sep 29 19:02:44 crc kubenswrapper[4780]: W0929 19:02:44.982849 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda6e087_f771_4d4c_870a_e6a1c9d1c98c.slice/crio-1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5 WatchSource:0}: Error finding container 1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5: Status 404 returned error can't find the container with id 1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5 Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.002728 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0c9e-account-create-z6hgb" event={"ID":"fda6e087-f771-4d4c-870a-e6a1c9d1c98c","Type":"ContainerStarted","Data":"1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5"} Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.008404 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b50e556e-2c88-4c9b-9493-4e3698b77aa4","Type":"ContainerDied","Data":"ea35d159961d706e6d59d3ea85a024f1865edf4a9c9f40b45836ac38fbb58997"} Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.008762 4780 scope.go:117] "RemoveContainer" containerID="91ce58230a4d4ba1acda2dffdc220234bf7d53441c137c7d750d474ad2d52511" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.008885 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.053211 4780 scope.go:117] "RemoveContainer" containerID="db1e0aac2dd316763743b8073f13f5f3d2945134648a553ed038384fa582568b" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.058548 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.083792 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.088300 4780 scope.go:117] "RemoveContainer" containerID="81e3d73027f91ab31aac220a7de46ac4a8630097a4cc4037feb1c64439b2bfce" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.096441 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:45 crc kubenswrapper[4780]: E0929 19:02:45.097174 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="sg-core" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097193 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="sg-core" Sep 29 19:02:45 crc kubenswrapper[4780]: E0929 19:02:45.097213 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-central-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097224 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-central-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: E0929 19:02:45.097238 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-notification-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097245 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-notification-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: E0929 19:02:45.097265 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="proxy-httpd" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097273 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="proxy-httpd" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097570 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="sg-core" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097593 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-central-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097602 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="ceilometer-notification-agent" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.097614 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" containerName="proxy-httpd" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.099987 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.101899 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.102318 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.114868 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.121087 4780 scope.go:117] "RemoveContainer" containerID="7563f62c7d192873664b5539633ac2fde9dc733fcd4ff8fc292dbe541058c7c4" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.270960 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271162 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271236 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271339 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmv9m\" (UniqueName: \"kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271388 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271435 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.271487 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.279560 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-48be-account-create-sjthn"] Sep 29 19:02:45 crc kubenswrapper[4780]: W0929 19:02:45.287615 
4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1848d52_c01b_4618_bbf7_777cc63f0544.slice/crio-0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b WatchSource:0}: Error finding container 0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b: Status 404 returned error can't find the container with id 0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373449 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmv9m\" (UniqueName: \"kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373494 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373535 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373567 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373663 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.373703 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.374187 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.374445 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd\") pod \"ceilometer-0\" 
(UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.380098 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.380151 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.380472 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.380956 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.392989 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmv9m\" (UniqueName: \"kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m\") pod \"ceilometer-0\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.435628 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:45 crc kubenswrapper[4780]: I0929 19:02:45.901492 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:45 crc kubenswrapper[4780]: W0929 19:02:45.910257 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6333d251_2637_4233_8ce3_2f327c721f7c.slice/crio-aa3e249151c650dfeca35db154ccf66338e6d8168cb27b2cb257879c9299ca2b WatchSource:0}: Error finding container aa3e249151c650dfeca35db154ccf66338e6d8168cb27b2cb257879c9299ca2b: Status 404 returned error can't find the container with id aa3e249151c650dfeca35db154ccf66338e6d8168cb27b2cb257879c9299ca2b Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.018528 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerStarted","Data":"aa3e249151c650dfeca35db154ccf66338e6d8168cb27b2cb257879c9299ca2b"} Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.022127 4780 generic.go:334] "Generic (PLEG): container finished" podID="d1848d52-c01b-4618-bbf7-777cc63f0544" containerID="032036dc95afbee3502dbfcf272cd78266cf379155bf2e1bb4815adf6cb65e9a" exitCode=0 Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.022176 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-48be-account-create-sjthn" event={"ID":"d1848d52-c01b-4618-bbf7-777cc63f0544","Type":"ContainerDied","Data":"032036dc95afbee3502dbfcf272cd78266cf379155bf2e1bb4815adf6cb65e9a"} Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.022217 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-48be-account-create-sjthn" event={"ID":"d1848d52-c01b-4618-bbf7-777cc63f0544","Type":"ContainerStarted","Data":"0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b"} Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.025689 4780 generic.go:334] "Generic (PLEG): container finished" podID="fda6e087-f771-4d4c-870a-e6a1c9d1c98c" containerID="30a39734b4259b0e3f9e44b9d713ecae2d2861e0a690d22cc730cca75a36853d" exitCode=0 Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.025747 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0c9e-account-create-z6hgb" event={"ID":"fda6e087-f771-4d4c-870a-e6a1c9d1c98c","Type":"ContainerDied","Data":"30a39734b4259b0e3f9e44b9d713ecae2d2861e0a690d22cc730cca75a36853d"} Sep 29 19:02:46 crc kubenswrapper[4780]: I0929 19:02:46.763419 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50e556e-2c88-4c9b-9493-4e3698b77aa4" path="/var/lib/kubelet/pods/b50e556e-2c88-4c9b-9493-4e3698b77aa4/volumes" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.373396 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.484319 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.514793 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftzdj\" (UniqueName: \"kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj\") pod \"d1848d52-c01b-4618-bbf7-777cc63f0544\" (UID: \"d1848d52-c01b-4618-bbf7-777cc63f0544\") " Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.524527 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj" (OuterVolumeSpecName: "kube-api-access-ftzdj") pod "d1848d52-c01b-4618-bbf7-777cc63f0544" (UID: "d1848d52-c01b-4618-bbf7-777cc63f0544"). InnerVolumeSpecName "kube-api-access-ftzdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.617436 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7frb2\" (UniqueName: \"kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2\") pod \"fda6e087-f771-4d4c-870a-e6a1c9d1c98c\" (UID: \"fda6e087-f771-4d4c-870a-e6a1c9d1c98c\") " Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.618375 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftzdj\" (UniqueName: \"kubernetes.io/projected/d1848d52-c01b-4618-bbf7-777cc63f0544-kube-api-access-ftzdj\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.623592 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2" (OuterVolumeSpecName: "kube-api-access-7frb2") pod "fda6e087-f771-4d4c-870a-e6a1c9d1c98c" (UID: "fda6e087-f771-4d4c-870a-e6a1c9d1c98c"). InnerVolumeSpecName "kube-api-access-7frb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:47 crc kubenswrapper[4780]: I0929 19:02:47.725001 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7frb2\" (UniqueName: \"kubernetes.io/projected/fda6e087-f771-4d4c-870a-e6a1c9d1c98c-kube-api-access-7frb2\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.052810 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerStarted","Data":"2959eb04dccc5e9c9edec62edeec16dd6c4c5155b26b0296c3795d7c0ad74543"} Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.056120 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-48be-account-create-sjthn" event={"ID":"d1848d52-c01b-4618-bbf7-777cc63f0544","Type":"ContainerDied","Data":"0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b"} Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.056151 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0625e9c8d08fb8b35d4d0a80cdb4b35d454f394610fa868d4f84b9711aa2cb4b" Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.056208 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-48be-account-create-sjthn" Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.059839 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0c9e-account-create-z6hgb" Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.069744 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0c9e-account-create-z6hgb" event={"ID":"fda6e087-f771-4d4c-870a-e6a1c9d1c98c","Type":"ContainerDied","Data":"1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5"} Sep 29 19:02:48 crc kubenswrapper[4780]: I0929 19:02:48.069849 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ecdc6a6617c9ec7e838943d02e7ff6ecae531d4bfca14372346e21de8ed36c5" Sep 29 19:02:49 crc kubenswrapper[4780]: I0929 19:02:49.072007 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerStarted","Data":"a8088582749e7b6cbb2da8c17e1cc6828e49374dd7627f985b8acfcbf1b00b93"} Sep 29 19:02:50 crc kubenswrapper[4780]: I0929 19:02:50.085779 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerStarted","Data":"fd45bac038f9ce6e79d06958b870f74ccf515d12a4f5526314e65557364ee13f"} Sep 29 19:02:52 crc kubenswrapper[4780]: I0929 19:02:52.116369 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerStarted","Data":"ebd47cb7f1791242ca19b24e013c1f46a4a88f4cd6f7477e020764370a1e0e77"} Sep 29 19:02:52 crc kubenswrapper[4780]: I0929 19:02:52.117346 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:02:52 crc kubenswrapper[4780]: I0929 19:02:52.155272 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.760194465 podStartE2EDuration="7.155247991s" podCreationTimestamp="2025-09-29 19:02:45 +0000 UTC" firstStartedPulling="2025-09-29 19:02:45.913547885 +0000 UTC m=+1165.861845929" lastFinishedPulling="2025-09-29 19:02:51.308601411 +0000 UTC m=+1171.256899455" observedRunningTime="2025-09-29 19:02:52.153910193 +0000 UTC m=+1172.102208237" watchObservedRunningTime="2025-09-29 19:02:52.155247991 +0000 UTC m=+1172.103546025" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.285865 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-63ac-account-create-l5pps"] Sep 29 19:02:54 crc kubenswrapper[4780]: E0929 19:02:54.286660 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1848d52-c01b-4618-bbf7-777cc63f0544" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.286677 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1848d52-c01b-4618-bbf7-777cc63f0544" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: E0929 19:02:54.286695 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda6e087-f771-4d4c-870a-e6a1c9d1c98c" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.286702 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda6e087-f771-4d4c-870a-e6a1c9d1c98c" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.286925 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1848d52-c01b-4618-bbf7-777cc63f0544" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.286940 4780 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fda6e087-f771-4d4c-870a-e6a1c9d1c98c" containerName="mariadb-account-create" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.287705 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.290351 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.310301 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-63ac-account-create-l5pps"] Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.362987 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f77k\" (UniqueName: \"kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k\") pod \"nova-cell0-63ac-account-create-l5pps\" (UID: \"d6407027-1b5b-454a-83b1-d08d03e5af9c\") " pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.464398 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f77k\" (UniqueName: \"kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k\") pod \"nova-cell0-63ac-account-create-l5pps\" (UID: \"d6407027-1b5b-454a-83b1-d08d03e5af9c\") " pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.489004 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f77k\" (UniqueName: \"kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k\") pod \"nova-cell0-63ac-account-create-l5pps\" (UID: \"d6407027-1b5b-454a-83b1-d08d03e5af9c\") " pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.547458 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.547787 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-central-agent" containerID="cri-o://2959eb04dccc5e9c9edec62edeec16dd6c4c5155b26b0296c3795d7c0ad74543" gracePeriod=30 Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.547853 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-notification-agent" containerID="cri-o://a8088582749e7b6cbb2da8c17e1cc6828e49374dd7627f985b8acfcbf1b00b93" gracePeriod=30 Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.547813 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="proxy-httpd" containerID="cri-o://ebd47cb7f1791242ca19b24e013c1f46a4a88f4cd6f7477e020764370a1e0e77" gracePeriod=30 Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.547925 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="sg-core" containerID="cri-o://fd45bac038f9ce6e79d06958b870f74ccf515d12a4f5526314e65557364ee13f" gracePeriod=30 Sep 29 19:02:54 crc kubenswrapper[4780]: I0929 19:02:54.625742 4780 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.117134 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-63ac-account-create-l5pps"] Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159398 4780 generic.go:334] "Generic (PLEG): container finished" podID="6333d251-2637-4233-8ce3-2f327c721f7c" containerID="ebd47cb7f1791242ca19b24e013c1f46a4a88f4cd6f7477e020764370a1e0e77" exitCode=0 Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159461 4780 generic.go:334] "Generic (PLEG): container finished" podID="6333d251-2637-4233-8ce3-2f327c721f7c" containerID="fd45bac038f9ce6e79d06958b870f74ccf515d12a4f5526314e65557364ee13f" exitCode=2 Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159469 4780 generic.go:334] "Generic (PLEG): container finished" podID="6333d251-2637-4233-8ce3-2f327c721f7c" containerID="a8088582749e7b6cbb2da8c17e1cc6828e49374dd7627f985b8acfcbf1b00b93" exitCode=0 Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159476 4780 generic.go:334] "Generic (PLEG): container finished" podID="6333d251-2637-4233-8ce3-2f327c721f7c" containerID="2959eb04dccc5e9c9edec62edeec16dd6c4c5155b26b0296c3795d7c0ad74543" exitCode=0 Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159478 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerDied","Data":"ebd47cb7f1791242ca19b24e013c1f46a4a88f4cd6f7477e020764370a1e0e77"} Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159534 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerDied","Data":"fd45bac038f9ce6e79d06958b870f74ccf515d12a4f5526314e65557364ee13f"} Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159545 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerDied","Data":"a8088582749e7b6cbb2da8c17e1cc6828e49374dd7627f985b8acfcbf1b00b93"} Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.159557 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerDied","Data":"2959eb04dccc5e9c9edec62edeec16dd6c4c5155b26b0296c3795d7c0ad74543"} Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.332585 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491276 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmv9m\" (UniqueName: \"kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491373 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491449 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491502 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491550 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491682 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.491705 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd\") pod \"6333d251-2637-4233-8ce3-2f327c721f7c\" (UID: \"6333d251-2637-4233-8ce3-2f327c721f7c\") " Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.492667 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.492806 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.497380 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts" (OuterVolumeSpecName: "scripts") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.497549 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m" (OuterVolumeSpecName: "kube-api-access-jmv9m") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "kube-api-access-jmv9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.519945 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.580704 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594825 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmv9m\" (UniqueName: \"kubernetes.io/projected/6333d251-2637-4233-8ce3-2f327c721f7c-kube-api-access-jmv9m\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594881 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594902 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594920 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594937 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.594954 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6333d251-2637-4233-8ce3-2f327c721f7c-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.620468 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data" (OuterVolumeSpecName: "config-data") pod "6333d251-2637-4233-8ce3-2f327c721f7c" (UID: "6333d251-2637-4233-8ce3-2f327c721f7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:02:55 crc kubenswrapper[4780]: I0929 19:02:55.696725 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333d251-2637-4233-8ce3-2f327c721f7c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.173256 4780 generic.go:334] "Generic (PLEG): container finished" podID="d6407027-1b5b-454a-83b1-d08d03e5af9c" containerID="3b46613406d996c1017570f6009c858346fa16fd0adca84ce6a74bbfd6caff42" exitCode=0 Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.173338 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-63ac-account-create-l5pps" event={"ID":"d6407027-1b5b-454a-83b1-d08d03e5af9c","Type":"ContainerDied","Data":"3b46613406d996c1017570f6009c858346fa16fd0adca84ce6a74bbfd6caff42"} Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.173371 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-63ac-account-create-l5pps" event={"ID":"d6407027-1b5b-454a-83b1-d08d03e5af9c","Type":"ContainerStarted","Data":"27df04d2aeabd83e9572bd71caa01d60081c4788f65957d87f0c7976ea4b868a"} Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.176803 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6333d251-2637-4233-8ce3-2f327c721f7c","Type":"ContainerDied","Data":"aa3e249151c650dfeca35db154ccf66338e6d8168cb27b2cb257879c9299ca2b"} Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.176879 4780 scope.go:117] "RemoveContainer" containerID="ebd47cb7f1791242ca19b24e013c1f46a4a88f4cd6f7477e020764370a1e0e77" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.177174 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.209405 4780 scope.go:117] "RemoveContainer" containerID="fd45bac038f9ce6e79d06958b870f74ccf515d12a4f5526314e65557364ee13f" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.228980 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.230017 4780 scope.go:117] "RemoveContainer" containerID="a8088582749e7b6cbb2da8c17e1cc6828e49374dd7627f985b8acfcbf1b00b93" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.238523 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263032 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:56 crc kubenswrapper[4780]: E0929 19:02:56.263660 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-central-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263682 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-central-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: E0929 19:02:56.263691 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="proxy-httpd" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263698 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="proxy-httpd" Sep 29 19:02:56 crc kubenswrapper[4780]: E0929 19:02:56.263707 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-notification-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263713 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-notification-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: E0929 19:02:56.263740 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="sg-core" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263746 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="sg-core" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263927 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-central-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263942 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="proxy-httpd" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263962 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="ceilometer-notification-agent" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.263969 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" containerName="sg-core" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.265846 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.269430 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.271237 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.274732 4780 scope.go:117] "RemoveContainer" containerID="2959eb04dccc5e9c9edec62edeec16dd6c4c5155b26b0296c3795d7c0ad74543" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.276378 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.409151 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.411298 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.411566 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.412855 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrg5\" (UniqueName: \"kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.413274 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.413494 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.413624 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.514893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.515934 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516076 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516183 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516295 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516404 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrg5\" (UniqueName: \"kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516633 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.516756 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.517220 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.521913 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.521931 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.524829 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.535321 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.535935 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrg5\" (UniqueName: \"kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5\") pod \"ceilometer-0\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.590370 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:02:56 crc kubenswrapper[4780]: I0929 19:02:56.763700 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6333d251-2637-4233-8ce3-2f327c721f7c" path="/var/lib/kubelet/pods/6333d251-2637-4233-8ce3-2f327c721f7c/volumes" Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.088878 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:02:57 crc kubenswrapper[4780]: W0929 19:02:57.089689 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58d17dc9_2c5c_4556_a642_c2de0881d879.slice/crio-c42d25a55df1cf87962c12d31f7a1eb1bfa9f663787c7f9005f8ee9e812c5054 WatchSource:0}: Error finding container c42d25a55df1cf87962c12d31f7a1eb1bfa9f663787c7f9005f8ee9e812c5054: Status 404 returned error can't find the container with id c42d25a55df1cf87962c12d31f7a1eb1bfa9f663787c7f9005f8ee9e812c5054 Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.190267 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerStarted","Data":"c42d25a55df1cf87962c12d31f7a1eb1bfa9f663787c7f9005f8ee9e812c5054"} Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.501033 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.641941 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f77k\" (UniqueName: \"kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k\") pod \"d6407027-1b5b-454a-83b1-d08d03e5af9c\" (UID: \"d6407027-1b5b-454a-83b1-d08d03e5af9c\") " Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.647324 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k" (OuterVolumeSpecName: "kube-api-access-2f77k") pod "d6407027-1b5b-454a-83b1-d08d03e5af9c" (UID: "d6407027-1b5b-454a-83b1-d08d03e5af9c"). 
InnerVolumeSpecName "kube-api-access-2f77k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:02:57 crc kubenswrapper[4780]: I0929 19:02:57.749362 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f77k\" (UniqueName: \"kubernetes.io/projected/d6407027-1b5b-454a-83b1-d08d03e5af9c-kube-api-access-2f77k\") on node \"crc\" DevicePath \"\"" Sep 29 19:02:58 crc kubenswrapper[4780]: I0929 19:02:58.202914 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-63ac-account-create-l5pps" event={"ID":"d6407027-1b5b-454a-83b1-d08d03e5af9c","Type":"ContainerDied","Data":"27df04d2aeabd83e9572bd71caa01d60081c4788f65957d87f0c7976ea4b868a"} Sep 29 19:02:58 crc kubenswrapper[4780]: I0929 19:02:58.203562 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27df04d2aeabd83e9572bd71caa01d60081c4788f65957d87f0c7976ea4b868a" Sep 29 19:02:58 crc kubenswrapper[4780]: I0929 19:02:58.203014 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-63ac-account-create-l5pps" Sep 29 19:02:58 crc kubenswrapper[4780]: I0929 19:02:58.205105 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerStarted","Data":"b55f8e909343af09dab44da32dd6bc4d3ed0d35e7ee48a4beec5a40cd2637b03"} Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.215740 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerStarted","Data":"31227d4db0dfb6a05800c015c34bdd6d3daa014b0a5ba6b93b2fb45f5e76d418"} Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.573890 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58hpf"] Sep 29 19:02:59 crc kubenswrapper[4780]: E0929 19:02:59.574402 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6407027-1b5b-454a-83b1-d08d03e5af9c" containerName="mariadb-account-create" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.574419 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6407027-1b5b-454a-83b1-d08d03e5af9c" containerName="mariadb-account-create" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.574611 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6407027-1b5b-454a-83b1-d08d03e5af9c" containerName="mariadb-account-create" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.575322 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.578771 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.578942 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.578770 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tdc5j" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.592412 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58hpf"] Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.690153 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.690249 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.690278 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.690294 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xd9\" (UniqueName: \"kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.792280 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.792418 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.792463 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: 
\"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.792491 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xd9\" (UniqueName: \"kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.797246 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.797869 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.798314 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.811787 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xd9\" (UniqueName: \"kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9\") pod \"nova-cell0-conductor-db-sync-58hpf\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:02:59 crc kubenswrapper[4780]: I0929 19:02:59.897334 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:03:00 crc kubenswrapper[4780]: I0929 19:03:00.233266 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerStarted","Data":"948eda3cd40dd68ede38309f69bb61d0543dcf5aeeef88cd6afa247d52193183"} Sep 29 19:03:00 crc kubenswrapper[4780]: I0929 19:03:00.405387 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58hpf"] Sep 29 19:03:01 crc kubenswrapper[4780]: I0929 19:03:01.243673 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58hpf" event={"ID":"f2087ead-f015-40af-b172-6cf166a01cf6","Type":"ContainerStarted","Data":"75ca96ebdbb93077e1c3c0d748c7e1ce69ea7e2f53b3f3e1a0443a1e5e9830fc"} Sep 29 19:03:02 crc kubenswrapper[4780]: I0929 19:03:02.259446 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerStarted","Data":"92ee612a44a7f3b4d4dd0cb202407c29f9ce68fd74231e477d2b55cc0f7f17b3"} Sep 29 19:03:02 crc kubenswrapper[4780]: I0929 19:03:02.260593 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:03:02 crc kubenswrapper[4780]: I0929 19:03:02.286468 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.122003289 podStartE2EDuration="6.28644955s" podCreationTimestamp="2025-09-29 19:02:56 +0000 UTC" firstStartedPulling="2025-09-29 19:02:57.092285229 +0000 UTC m=+1177.040583283" lastFinishedPulling="2025-09-29 19:03:01.2567315 +0000 UTC m=+1181.205029544" observedRunningTime="2025-09-29 19:03:02.283008983 +0000 UTC m=+1182.231307027" watchObservedRunningTime="2025-09-29 19:03:02.28644955 +0000 UTC m=+1182.234747594" Sep 29 19:03:04 crc kubenswrapper[4780]: I0929 19:03:04.221272 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:04 crc kubenswrapper[4780]: I0929 19:03:04.283694 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-central-agent" containerID="cri-o://b55f8e909343af09dab44da32dd6bc4d3ed0d35e7ee48a4beec5a40cd2637b03" gracePeriod=30 Sep 29 19:03:04 crc kubenswrapper[4780]: I0929 19:03:04.283716 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="proxy-httpd" containerID="cri-o://92ee612a44a7f3b4d4dd0cb202407c29f9ce68fd74231e477d2b55cc0f7f17b3" gracePeriod=30 Sep 29 19:03:04 crc kubenswrapper[4780]: I0929 19:03:04.283867 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-notification-agent" containerID="cri-o://31227d4db0dfb6a05800c015c34bdd6d3daa014b0a5ba6b93b2fb45f5e76d418" gracePeriod=30 Sep 29 19:03:04 crc kubenswrapper[4780]: I0929 19:03:04.283917 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="sg-core" containerID="cri-o://948eda3cd40dd68ede38309f69bb61d0543dcf5aeeef88cd6afa247d52193183" gracePeriod=30 Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297284 4780 
generic.go:334] "Generic (PLEG): container finished" podID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerID="92ee612a44a7f3b4d4dd0cb202407c29f9ce68fd74231e477d2b55cc0f7f17b3" exitCode=0 Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297320 4780 generic.go:334] "Generic (PLEG): container finished" podID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerID="948eda3cd40dd68ede38309f69bb61d0543dcf5aeeef88cd6afa247d52193183" exitCode=2 Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297330 4780 generic.go:334] "Generic (PLEG): container finished" podID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerID="31227d4db0dfb6a05800c015c34bdd6d3daa014b0a5ba6b93b2fb45f5e76d418" exitCode=0 Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297344 4780 generic.go:334] "Generic (PLEG): container finished" podID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerID="b55f8e909343af09dab44da32dd6bc4d3ed0d35e7ee48a4beec5a40cd2637b03" exitCode=0 Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297367 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerDied","Data":"92ee612a44a7f3b4d4dd0cb202407c29f9ce68fd74231e477d2b55cc0f7f17b3"} Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297396 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerDied","Data":"948eda3cd40dd68ede38309f69bb61d0543dcf5aeeef88cd6afa247d52193183"} Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297409 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerDied","Data":"31227d4db0dfb6a05800c015c34bdd6d3daa014b0a5ba6b93b2fb45f5e76d418"} Sep 29 19:03:05 crc kubenswrapper[4780]: I0929 19:03:05.297421 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerDied","Data":"b55f8e909343af09dab44da32dd6bc4d3ed0d35e7ee48a4beec5a40cd2637b03"} Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.843307 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.972915 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973099 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973252 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973288 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973349 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973394 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973538 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdrg5\" (UniqueName: \"kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5\") pod \"58d17dc9-2c5c-4556-a642-c2de0881d879\" (UID: \"58d17dc9-2c5c-4556-a642-c2de0881d879\") " Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973765 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.973683 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.974422 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.974458 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d17dc9-2c5c-4556-a642-c2de0881d879-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.978079 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5" (OuterVolumeSpecName: "kube-api-access-cdrg5") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "kube-api-access-cdrg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:07 crc kubenswrapper[4780]: I0929 19:03:07.979316 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts" (OuterVolumeSpecName: "scripts") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.004619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.072108 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.076010 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.076037 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.076064 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.076080 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdrg5\" (UniqueName: \"kubernetes.io/projected/58d17dc9-2c5c-4556-a642-c2de0881d879-kube-api-access-cdrg5\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.077388 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data" (OuterVolumeSpecName: "config-data") pod "58d17dc9-2c5c-4556-a642-c2de0881d879" (UID: "58d17dc9-2c5c-4556-a642-c2de0881d879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.178209 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d17dc9-2c5c-4556-a642-c2de0881d879-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.325154 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58hpf" event={"ID":"f2087ead-f015-40af-b172-6cf166a01cf6","Type":"ContainerStarted","Data":"fbbd56e2c35a1f0f783bb787a2c0d549b1f65db0f5817736155304a6429851de"} Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.327931 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d17dc9-2c5c-4556-a642-c2de0881d879","Type":"ContainerDied","Data":"c42d25a55df1cf87962c12d31f7a1eb1bfa9f663787c7f9005f8ee9e812c5054"} Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.327982 4780 scope.go:117] "RemoveContainer" containerID="92ee612a44a7f3b4d4dd0cb202407c29f9ce68fd74231e477d2b55cc0f7f17b3" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.327998 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.350731 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-58hpf" podStartSLOduration=2.214632789 podStartE2EDuration="9.350539201s" podCreationTimestamp="2025-09-29 19:02:59 +0000 UTC" firstStartedPulling="2025-09-29 19:03:00.414312138 +0000 UTC m=+1180.362610182" lastFinishedPulling="2025-09-29 19:03:07.55021855 +0000 UTC m=+1187.498516594" observedRunningTime="2025-09-29 19:03:08.344674855 +0000 UTC m=+1188.292972909" watchObservedRunningTime="2025-09-29 19:03:08.350539201 +0000 UTC m=+1188.298837265" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.366964 4780 scope.go:117] "RemoveContainer" containerID="948eda3cd40dd68ede38309f69bb61d0543dcf5aeeef88cd6afa247d52193183" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.373353 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.406298 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.420765 4780 scope.go:117] "RemoveContainer" containerID="31227d4db0dfb6a05800c015c34bdd6d3daa014b0a5ba6b93b2fb45f5e76d418" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.437156 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:08 crc kubenswrapper[4780]: E0929 19:03:08.440434 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-notification-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.440502 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-notification-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: E0929 19:03:08.440534 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="proxy-httpd" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.440543 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="proxy-httpd" Sep 29 19:03:08 crc kubenswrapper[4780]: E0929 19:03:08.440591 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-central-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.440602 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-central-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: E0929 19:03:08.440613 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="sg-core" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.440620 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="sg-core" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.441250 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-central-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.441288 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="sg-core" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 
19:03:08.441299 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="ceilometer-notification-agent" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.441325 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" containerName="proxy-httpd" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.446179 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.448924 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.448952 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.454243 4780 scope.go:117] "RemoveContainer" containerID="b55f8e909343af09dab44da32dd6bc4d3ed0d35e7ee48a4beec5a40cd2637b03" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.464140 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.584944 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585023 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5khsc\" (UniqueName: \"kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585118 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585379 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585607 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585641 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.585757 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687359 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687477 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687496 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687547 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687573 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687618 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5khsc\" (UniqueName: \"kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.687673 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.689418 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.689626 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.694303 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.694602 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.695854 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.695910 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.708750 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5khsc\" (UniqueName: \"kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc\") pod \"ceilometer-0\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " pod="openstack/ceilometer-0" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.769338 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d17dc9-2c5c-4556-a642-c2de0881d879" path="/var/lib/kubelet/pods/58d17dc9-2c5c-4556-a642-c2de0881d879/volumes" Sep 29 19:03:08 crc kubenswrapper[4780]: I0929 19:03:08.831203 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:09 crc kubenswrapper[4780]: W0929 19:03:09.309924 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce576755_26bf_411e_807f_0b8a71ce54ed.slice/crio-ae16f7c976f2301ed3cc2cf2c799de9f312da796d61954c04cd8bca8908bea3a WatchSource:0}: Error finding container ae16f7c976f2301ed3cc2cf2c799de9f312da796d61954c04cd8bca8908bea3a: Status 404 returned error can't find the container with id ae16f7c976f2301ed3cc2cf2c799de9f312da796d61954c04cd8bca8908bea3a Sep 29 19:03:09 crc kubenswrapper[4780]: I0929 19:03:09.312643 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:09 crc kubenswrapper[4780]: I0929 19:03:09.338509 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerStarted","Data":"ae16f7c976f2301ed3cc2cf2c799de9f312da796d61954c04cd8bca8908bea3a"} Sep 29 19:03:10 crc kubenswrapper[4780]: I0929 19:03:10.353128 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerStarted","Data":"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be"} Sep 29 19:03:11 crc kubenswrapper[4780]: I0929 19:03:11.365775 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerStarted","Data":"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae"} Sep 29 19:03:12 crc kubenswrapper[4780]: I0929 19:03:12.394626 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerStarted","Data":"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016"} Sep 29 19:03:13 crc kubenswrapper[4780]: I0929 19:03:13.408313 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerStarted","Data":"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2"} Sep 29 19:03:13 crc kubenswrapper[4780]: I0929 19:03:13.408662 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:03:13 crc kubenswrapper[4780]: I0929 19:03:13.455089 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.951927173 podStartE2EDuration="5.455041005s" podCreationTimestamp="2025-09-29 19:03:08 +0000 UTC" firstStartedPulling="2025-09-29 19:03:09.323364412 +0000 UTC m=+1189.271662456" lastFinishedPulling="2025-09-29 19:03:12.826478244 +0000 UTC m=+1192.774776288" observedRunningTime="2025-09-29 19:03:13.434309048 +0000 UTC m=+1193.382607102" watchObservedRunningTime="2025-09-29 19:03:13.455041005 +0000 UTC m=+1193.403339069" Sep 29 19:03:19 crc kubenswrapper[4780]: I0929 19:03:19.478847 4780 generic.go:334] "Generic (PLEG): container finished" podID="f2087ead-f015-40af-b172-6cf166a01cf6" containerID="fbbd56e2c35a1f0f783bb787a2c0d549b1f65db0f5817736155304a6429851de" exitCode=0 Sep 29 19:03:19 crc kubenswrapper[4780]: I0929 19:03:19.478931 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58hpf" 
event={"ID":"f2087ead-f015-40af-b172-6cf166a01cf6","Type":"ContainerDied","Data":"fbbd56e2c35a1f0f783bb787a2c0d549b1f65db0f5817736155304a6429851de"} Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.898870 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.944859 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts\") pod \"f2087ead-f015-40af-b172-6cf166a01cf6\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.944999 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle\") pod \"f2087ead-f015-40af-b172-6cf166a01cf6\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.945025 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xd9\" (UniqueName: \"kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9\") pod \"f2087ead-f015-40af-b172-6cf166a01cf6\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.945234 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data\") pod \"f2087ead-f015-40af-b172-6cf166a01cf6\" (UID: \"f2087ead-f015-40af-b172-6cf166a01cf6\") " Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.952639 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts" (OuterVolumeSpecName: "scripts") pod "f2087ead-f015-40af-b172-6cf166a01cf6" (UID: "f2087ead-f015-40af-b172-6cf166a01cf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.955291 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9" (OuterVolumeSpecName: "kube-api-access-75xd9") pod "f2087ead-f015-40af-b172-6cf166a01cf6" (UID: "f2087ead-f015-40af-b172-6cf166a01cf6"). InnerVolumeSpecName "kube-api-access-75xd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.976038 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data" (OuterVolumeSpecName: "config-data") pod "f2087ead-f015-40af-b172-6cf166a01cf6" (UID: "f2087ead-f015-40af-b172-6cf166a01cf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:20 crc kubenswrapper[4780]: I0929 19:03:20.979247 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2087ead-f015-40af-b172-6cf166a01cf6" (UID: "f2087ead-f015-40af-b172-6cf166a01cf6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.047186 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.047466 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xd9\" (UniqueName: \"kubernetes.io/projected/f2087ead-f015-40af-b172-6cf166a01cf6-kube-api-access-75xd9\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.047531 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.047597 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2087ead-f015-40af-b172-6cf166a01cf6-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.519338 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-58hpf" event={"ID":"f2087ead-f015-40af-b172-6cf166a01cf6","Type":"ContainerDied","Data":"75ca96ebdbb93077e1c3c0d748c7e1ce69ea7e2f53b3f3e1a0443a1e5e9830fc"} Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.519438 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75ca96ebdbb93077e1c3c0d748c7e1ce69ea7e2f53b3f3e1a0443a1e5e9830fc" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.519549 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-58hpf" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.669338 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:03:21 crc kubenswrapper[4780]: E0929 19:03:21.669957 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2087ead-f015-40af-b172-6cf166a01cf6" containerName="nova-cell0-conductor-db-sync" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.669979 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2087ead-f015-40af-b172-6cf166a01cf6" containerName="nova-cell0-conductor-db-sync" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.670234 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2087ead-f015-40af-b172-6cf166a01cf6" containerName="nova-cell0-conductor-db-sync" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.671181 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.674620 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.675107 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tdc5j" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.688933 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.771117 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92w8q\" (UniqueName: \"kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.771194 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.771287 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.873677 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92w8q\" (UniqueName: \"kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.873762 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.873854 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.880815 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.880895 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:21 crc kubenswrapper[4780]: I0929 19:03:21.893229 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92w8q\" (UniqueName: \"kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q\") pod \"nova-cell0-conductor-0\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:22 crc kubenswrapper[4780]: I0929 19:03:22.002371 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:22 crc kubenswrapper[4780]: I0929 19:03:22.449601 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:03:22 crc kubenswrapper[4780]: W0929 19:03:22.465339 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa6b4d2f_2f81_44fd_8c76_2aa6204209c3.slice/crio-e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676 WatchSource:0}: Error finding container e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676: Status 404 returned error can't find the container with id e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676 Sep 29 19:03:22 crc kubenswrapper[4780]: I0929 19:03:22.531591 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3","Type":"ContainerStarted","Data":"e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676"} Sep 29 19:03:23 crc kubenswrapper[4780]: I0929 19:03:23.548438 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3","Type":"ContainerStarted","Data":"739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e"} Sep 29 19:03:23 crc kubenswrapper[4780]: I0929 19:03:23.548899 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.037266 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.059461 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=6.059439991 podStartE2EDuration="6.059439991s" podCreationTimestamp="2025-09-29 19:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:23.580350707 +0000 UTC m=+1203.528648771" watchObservedRunningTime="2025-09-29 19:03:27.059439991 +0000 UTC m=+1207.007738045" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.485653 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vg5kt"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.487266 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.490930 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.492986 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.532518 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vg5kt"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.603862 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.603947 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tg2f\" (UniqueName: \"kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.604013 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.604060 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.706006 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.706098 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tg2f\" (UniqueName: \"kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.706153 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.706188 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.706551 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.708663 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.714549 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.718789 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.742885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.745688 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tg2f\" (UniqueName: \"kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.761594 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts\") pod \"nova-cell0-cell-mapping-vg5kt\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.806125 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.810677 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.810751 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.810786 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87b7l\" (UniqueName: \"kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.810893 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.812807 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.888128 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.889881 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.898713 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.911109 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.913915 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.914167 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.914191 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.914216 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.914235 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87b7l\" (UniqueName: \"kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.923284 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.923347 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdxvw\" (UniqueName: 
\"kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.923443 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.925762 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.937898 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.937977 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.939385 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.946545 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.971766 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.985659 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:27 crc kubenswrapper[4780]: I0929 19:03:27.990299 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87b7l\" (UniqueName: \"kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l\") pod \"nova-api-0\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " pod="openstack/nova-api-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.023288 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.024831 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvw26\" (UniqueName: \"kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.024985 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.025111 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.025216 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.025301 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.025422 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.025514 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdxvw\" (UniqueName: \"kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.029517 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.044072 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.060139 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.062673 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdxvw\" (UniqueName: \"kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw\") pod \"nova-metadata-0\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.069483 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.071298 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.075153 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.124098 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132393 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132445 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132529 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132560 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvw26\" (UniqueName: \"kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132612 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflbq\" (UniqueName: \"kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132636 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: 
\"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132660 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132693 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.132715 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.141617 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.144875 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.156953 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.158575 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.163290 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.178421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvw26\" (UniqueName: \"kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26\") pod \"nova-scheduler-0\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.189743 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234493 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234565 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflbq\" (UniqueName: \"kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234591 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234626 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234645 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234667 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hl56\" (UniqueName: \"kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234719 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc 
kubenswrapper[4780]: I0929 19:03:28.234773 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.234803 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.235753 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.236420 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.237395 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.238016 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.238676 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.266783 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflbq\" (UniqueName: \"kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq\") pod \"dnsmasq-dns-7d9cc4c77f-ktp5p\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") " pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.344755 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.344832 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4hl56\" (UniqueName: \"kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.344961 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.353811 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.360690 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.370687 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hl56\" (UniqueName: \"kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56\") pod \"nova-cell1-novncproxy-0\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.392630 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.405601 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.486263 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.567228 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vg5kt"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.646922 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vg5kt" event={"ID":"334ed004-9824-4be5-bf0c-027315c0bc82","Type":"ContainerStarted","Data":"7203fa96ff1ae0c56007d6c9a88f923492fc7a43c339e69dc917e7b136ce0772"} Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.705778 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.853549 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n48tz"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.856364 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.864116 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.865490 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.869808 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.881996 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n48tz"] Sep 29 19:03:28 crc kubenswrapper[4780]: W0929 19:03:28.905431 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff590f2e_ec78_4136_a6db_c673f4f116e9.slice/crio-ca964e7c157038564a6b990c967909569a52ab3c95d3a13f4da089bdf74140b6 WatchSource:0}: Error finding container ca964e7c157038564a6b990c967909569a52ab3c95d3a13f4da089bdf74140b6: Status 404 returned error can't find the container with id ca964e7c157038564a6b990c967909569a52ab3c95d3a13f4da089bdf74140b6 Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.975316 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.986081 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8nkh\" (UniqueName: \"kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.986143 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.986178 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.986361 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:28 crc kubenswrapper[4780]: W0929 19:03:28.992739 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9dbc5a2_e2dc_4ab1_9957_c1120bc77767.slice/crio-9bb5901c535c70df8891ae9c196062a02c6bad9cb36f403439a2f9cfb71f9768 WatchSource:0}: Error finding container 9bb5901c535c70df8891ae9c196062a02c6bad9cb36f403439a2f9cfb71f9768: Status 404 returned error 
can't find the container with id 9bb5901c535c70df8891ae9c196062a02c6bad9cb36f403439a2f9cfb71f9768 Sep 29 19:03:28 crc kubenswrapper[4780]: I0929 19:03:28.993594 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"] Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.088784 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.088877 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8nkh\" (UniqueName: \"kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.088914 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.088937 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.098711 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.098837 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.099246 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.110463 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8nkh\" (UniqueName: \"kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh\") pod \"nova-cell1-conductor-db-sync-n48tz\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.190471 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:03:29 crc kubenswrapper[4780]: W0929 19:03:29.191689 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb06bfdaa_ca44_4904_8f50_09196dd1b882.slice/crio-9fa09688fe94f02b3bb1c1368a264ea6ebc8c739a77d1317231ce81db61fb07a WatchSource:0}: Error finding container 9fa09688fe94f02b3bb1c1368a264ea6ebc8c739a77d1317231ce81db61fb07a: Status 404 returned error can't find the container with id 9fa09688fe94f02b3bb1c1368a264ea6ebc8c739a77d1317231ce81db61fb07a Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.264904 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.662705 4780 generic.go:334] "Generic (PLEG): container finished" podID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerID="7417c48c704e193717034fa2316dd912006f1d6915312181a0070cefc0aec660" exitCode=0 Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.662782 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" event={"ID":"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767","Type":"ContainerDied","Data":"7417c48c704e193717034fa2316dd912006f1d6915312181a0070cefc0aec660"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.663078 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" event={"ID":"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767","Type":"ContainerStarted","Data":"9bb5901c535c70df8891ae9c196062a02c6bad9cb36f403439a2f9cfb71f9768"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.666238 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b06bfdaa-ca44-4904-8f50-09196dd1b882","Type":"ContainerStarted","Data":"9fa09688fe94f02b3bb1c1368a264ea6ebc8c739a77d1317231ce81db61fb07a"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.668496 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vg5kt" event={"ID":"334ed004-9824-4be5-bf0c-027315c0bc82","Type":"ContainerStarted","Data":"60308fe91edf8a4076c678529c97807d46bf256eda44506a25d107905d15a376"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.670296 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerStarted","Data":"ca964e7c157038564a6b990c967909569a52ab3c95d3a13f4da089bdf74140b6"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.671908 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerStarted","Data":"93a87df348d59e386b255fe78af4957e7aeb6db040cb01bf781b3241091f07f0"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.673091 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b","Type":"ContainerStarted","Data":"e7a4c41c508e8e919d5304f1f9f56c808cc5b053867b0ae118ef8f20eb22c7b2"} Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.712955 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vg5kt" podStartSLOduration=2.712906857 podStartE2EDuration="2.712906857s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:29.707387631 +0000 UTC m=+1209.655685705" watchObservedRunningTime="2025-09-29 19:03:29.712906857 +0000 UTC m=+1209.661204901" Sep 29 19:03:29 crc kubenswrapper[4780]: I0929 19:03:29.779249 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n48tz"] Sep 29 19:03:30 crc kubenswrapper[4780]: I0929 19:03:30.688630 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n48tz" event={"ID":"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c","Type":"ContainerStarted","Data":"667ea135113b961c6b2a36ada8f212c39fb66bc12bbf320d1a2bbbed6a920a4c"} Sep 29 19:03:30 crc kubenswrapper[4780]: I0929 19:03:30.689157 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n48tz" event={"ID":"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c","Type":"ContainerStarted","Data":"882b2f70a74bdfda925c9f1353e2fd64bc8a3fcaaeea6b08e73fdb0b466f9eac"} Sep 29 19:03:30 crc kubenswrapper[4780]: I0929 19:03:30.693000 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" event={"ID":"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767","Type":"ContainerStarted","Data":"768f5d5cf8168347307b5b17a365e35ef6b005b15cbb0e0cec6144bacc00023d"} Sep 29 19:03:30 crc kubenswrapper[4780]: I0929 19:03:30.718443 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-n48tz" podStartSLOduration=2.718400392 podStartE2EDuration="2.718400392s" podCreationTimestamp="2025-09-29 19:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:30.703176991 +0000 UTC m=+1210.651475035" watchObservedRunningTime="2025-09-29 19:03:30.718400392 +0000 UTC m=+1210.666698426" Sep 29 19:03:30 crc kubenswrapper[4780]: I0929 19:03:30.733663 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" podStartSLOduration=3.733643253 podStartE2EDuration="3.733643253s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:30.725507613 +0000 UTC m=+1210.673805677" watchObservedRunningTime="2025-09-29 19:03:30.733643253 +0000 UTC m=+1210.681941297" Sep 29 19:03:31 crc kubenswrapper[4780]: I0929 19:03:31.707975 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:32 crc kubenswrapper[4780]: I0929 19:03:32.171844 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:03:32 crc kubenswrapper[4780]: I0929 19:03:32.183987 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.729576 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerStarted","Data":"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.730443 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerStarted","Data":"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.730121 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-metadata" containerID="cri-o://4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" gracePeriod=30 Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.729810 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-log" containerID="cri-o://91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" gracePeriod=30 Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.742975 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b","Type":"ContainerStarted","Data":"d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.752578 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b06bfdaa-ca44-4904-8f50-09196dd1b882","Type":"ContainerStarted","Data":"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.752651 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b06bfdaa-ca44-4904-8f50-09196dd1b882" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8" gracePeriod=30 Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.769480 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.894079052 podStartE2EDuration="6.769457895s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="2025-09-29 19:03:28.758518587 +0000 UTC m=+1208.706816631" lastFinishedPulling="2025-09-29 19:03:32.63389742 +0000 UTC m=+1212.582195474" observedRunningTime="2025-09-29 19:03:33.754372608 +0000 UTC m=+1213.702670652" watchObservedRunningTime="2025-09-29 19:03:33.769457895 +0000 UTC m=+1213.717755939" Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.783170 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerStarted","Data":"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.783253 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerStarted","Data":"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f"} Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.787015 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.1359814950000002 podStartE2EDuration="6.786993451s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="2025-09-29 19:03:28.981355481 +0000 UTC m=+1208.929653525" lastFinishedPulling="2025-09-29 19:03:32.632367437 +0000 UTC m=+1212.580665481" observedRunningTime="2025-09-29 
19:03:33.782068832 +0000 UTC m=+1213.730366876" watchObservedRunningTime="2025-09-29 19:03:33.786993451 +0000 UTC m=+1213.735291495" Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.805862 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.361988339 podStartE2EDuration="6.805831444s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="2025-09-29 19:03:29.194018897 +0000 UTC m=+1209.142316941" lastFinishedPulling="2025-09-29 19:03:32.637862002 +0000 UTC m=+1212.586160046" observedRunningTime="2025-09-29 19:03:33.804139406 +0000 UTC m=+1213.752437450" watchObservedRunningTime="2025-09-29 19:03:33.805831444 +0000 UTC m=+1213.754129488" Sep 29 19:03:33 crc kubenswrapper[4780]: I0929 19:03:33.829919 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.112764489 podStartE2EDuration="6.829895885s" podCreationTimestamp="2025-09-29 19:03:27 +0000 UTC" firstStartedPulling="2025-09-29 19:03:28.9155854 +0000 UTC m=+1208.863883435" lastFinishedPulling="2025-09-29 19:03:32.632716787 +0000 UTC m=+1212.581014831" observedRunningTime="2025-09-29 19:03:33.820579441 +0000 UTC m=+1213.768877485" watchObservedRunningTime="2025-09-29 19:03:33.829895885 +0000 UTC m=+1213.778193929" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.365070 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.463540 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdxvw\" (UniqueName: \"kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw\") pod \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.463923 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs\") pod \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.463973 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle\") pod \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.464034 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data\") pod \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\" (UID: \"d8367b89-9ebe-4186-a9f5-1ffec1e10c42\") " Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.464195 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs" (OuterVolumeSpecName: "logs") pod "d8367b89-9ebe-4186-a9f5-1ffec1e10c42" (UID: "d8367b89-9ebe-4186-a9f5-1ffec1e10c42"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.464519 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.470966 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw" (OuterVolumeSpecName: "kube-api-access-xdxvw") pod "d8367b89-9ebe-4186-a9f5-1ffec1e10c42" (UID: "d8367b89-9ebe-4186-a9f5-1ffec1e10c42"). InnerVolumeSpecName "kube-api-access-xdxvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.492307 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8367b89-9ebe-4186-a9f5-1ffec1e10c42" (UID: "d8367b89-9ebe-4186-a9f5-1ffec1e10c42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.497161 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data" (OuterVolumeSpecName: "config-data") pod "d8367b89-9ebe-4186-a9f5-1ffec1e10c42" (UID: "d8367b89-9ebe-4186-a9f5-1ffec1e10c42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.566773 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdxvw\" (UniqueName: \"kubernetes.io/projected/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-kube-api-access-xdxvw\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.566831 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.566846 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8367b89-9ebe-4186-a9f5-1ffec1e10c42-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796074 4780 generic.go:334] "Generic (PLEG): container finished" podID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerID="4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" exitCode=0 Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796110 4780 generic.go:334] "Generic (PLEG): container finished" podID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerID="91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" exitCode=143 Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796205 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796259 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerDied","Data":"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8"} Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796294 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerDied","Data":"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841"} Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796305 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8367b89-9ebe-4186-a9f5-1ffec1e10c42","Type":"ContainerDied","Data":"93a87df348d59e386b255fe78af4957e7aeb6db040cb01bf781b3241091f07f0"} Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.796320 4780 scope.go:117] "RemoveContainer" containerID="4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.830164 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.845137 4780 scope.go:117] "RemoveContainer" containerID="91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.853180 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.860846 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:34 crc kubenswrapper[4780]: E0929 19:03:34.861312 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-metadata" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.861330 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-metadata" Sep 29 19:03:34 crc kubenswrapper[4780]: E0929 19:03:34.861352 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-log" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.861359 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-log" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.861541 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-log" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.861566 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" containerName="nova-metadata-metadata" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.863747 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.869977 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.870154 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.895035 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.898304 4780 scope.go:117] "RemoveContainer" containerID="4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" Sep 29 19:03:34 crc kubenswrapper[4780]: E0929 19:03:34.899122 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8\": container with ID starting with 4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8 not found: ID does not exist" containerID="4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.899199 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8"} err="failed to get container status \"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8\": rpc error: code = NotFound desc = could not find container \"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8\": container with ID starting with 4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8 not found: ID does not exist" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.899239 4780 scope.go:117] "RemoveContainer" containerID="91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" Sep 29 19:03:34 crc kubenswrapper[4780]: E0929 19:03:34.899879 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841\": container with ID starting with 91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841 not found: ID does not exist" containerID="91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.899908 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841"} err="failed to get container status \"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841\": rpc error: code = NotFound desc = could not find container \"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841\": container with ID starting with 91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841 not found: ID does not exist" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.899927 4780 scope.go:117] "RemoveContainer" containerID="4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.901929 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8"} err="failed to get container status \"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8\": rpc error: 
code = NotFound desc = could not find container \"4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8\": container with ID starting with 4d4aa7e28f4593b029c466f0b6d3e6eff8d3ce76923d53319014ab03c132d8a8 not found: ID does not exist" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.901963 4780 scope.go:117] "RemoveContainer" containerID="91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.905473 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841"} err="failed to get container status \"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841\": rpc error: code = NotFound desc = could not find container \"91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841\": container with ID starting with 91f46115d33bd900b24e0dd2165e3a400261fdf561c98cf3e41d061cbf9f7841 not found: ID does not exist" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.979684 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.979750 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.979791 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.979848 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:34 crc kubenswrapper[4780]: I0929 19:03:34.979884 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw78w\" (UniqueName: \"kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.081376 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw78w\" (UniqueName: \"kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.081494 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs\") pod 
\"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.081531 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.081571 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.081656 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.082510 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.087296 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.087421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.095749 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.106830 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw78w\" (UniqueName: \"kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w\") pod \"nova-metadata-0\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.186608 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.710384 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:35 crc kubenswrapper[4780]: W0929 19:03:35.721441 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a3ae30_b7ed_48d8_a04c_bdaa3822834c.slice/crio-22fe3201188dc3b2bf4483a40e031dab48e3435c83086accadfce016917f12d5 WatchSource:0}: Error finding container 22fe3201188dc3b2bf4483a40e031dab48e3435c83086accadfce016917f12d5: Status 404 returned error can't find the container with id 22fe3201188dc3b2bf4483a40e031dab48e3435c83086accadfce016917f12d5 Sep 29 19:03:35 crc kubenswrapper[4780]: I0929 19:03:35.808570 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerStarted","Data":"22fe3201188dc3b2bf4483a40e031dab48e3435c83086accadfce016917f12d5"} Sep 29 19:03:36 crc kubenswrapper[4780]: I0929 19:03:36.765587 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8367b89-9ebe-4186-a9f5-1ffec1e10c42" path="/var/lib/kubelet/pods/d8367b89-9ebe-4186-a9f5-1ffec1e10c42/volumes" Sep 29 19:03:36 crc kubenswrapper[4780]: I0929 19:03:36.828080 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerStarted","Data":"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61"} Sep 29 19:03:36 crc kubenswrapper[4780]: I0929 19:03:36.828149 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerStarted","Data":"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73"} Sep 29 19:03:36 crc kubenswrapper[4780]: I0929 19:03:36.853830 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8538137199999998 podStartE2EDuration="2.85381372s" podCreationTimestamp="2025-09-29 19:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:36.850907638 +0000 UTC m=+1216.799205702" watchObservedRunningTime="2025-09-29 19:03:36.85381372 +0000 UTC m=+1216.802111764" Sep 29 19:03:37 crc kubenswrapper[4780]: I0929 19:03:37.856466 4780 generic.go:334] "Generic (PLEG): container finished" podID="334ed004-9824-4be5-bf0c-027315c0bc82" containerID="60308fe91edf8a4076c678529c97807d46bf256eda44506a25d107905d15a376" exitCode=0 Sep 29 19:03:37 crc kubenswrapper[4780]: I0929 19:03:37.856529 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vg5kt" event={"ID":"334ed004-9824-4be5-bf0c-027315c0bc82","Type":"ContainerDied","Data":"60308fe91edf8a4076c678529c97807d46bf256eda44506a25d107905d15a376"} Sep 29 19:03:37 crc kubenswrapper[4780]: I0929 19:03:37.861060 4780 generic.go:334] "Generic (PLEG): container finished" podID="d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" containerID="667ea135113b961c6b2a36ada8f212c39fb66bc12bbf320d1a2bbbed6a920a4c" exitCode=0 Sep 29 19:03:37 crc kubenswrapper[4780]: I0929 19:03:37.861091 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n48tz" 
event={"ID":"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c","Type":"ContainerDied","Data":"667ea135113b961c6b2a36ada8f212c39fb66bc12bbf320d1a2bbbed6a920a4c"} Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.024067 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.024726 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.394132 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.394197 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.407332 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.426339 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.474225 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"] Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.474591 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="dnsmasq-dns" containerID="cri-o://084913df48747b008d8c649ad400b9a86279655d6e345198f5534ceb7c325bcd" gracePeriod=10 Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.487239 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.844600 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.902379 4780 generic.go:334] "Generic (PLEG): container finished" podID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerID="084913df48747b008d8c649ad400b9a86279655d6e345198f5534ceb7c325bcd" exitCode=0 Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.902448 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" event={"ID":"b0e696ba-be42-4d7e-8208-d5ff25c7b61c","Type":"ContainerDied","Data":"084913df48747b008d8c649ad400b9a86279655d6e345198f5534ceb7c325bcd"} Sep 29 19:03:38 crc kubenswrapper[4780]: I0929 19:03:38.966088 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.079293 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.109548 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.109652 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.207392 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.207618 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.207784 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.207859 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc2xr\" (UniqueName: \"kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.207917 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.208130 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config\") pod \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\" (UID: \"b0e696ba-be42-4d7e-8208-d5ff25c7b61c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.233525 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr" (OuterVolumeSpecName: "kube-api-access-pc2xr") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "kube-api-access-pc2xr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.338393 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.341458 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.348598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.353169 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.353200 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc2xr\" (UniqueName: \"kubernetes.io/projected/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-kube-api-access-pc2xr\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.353211 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.353219 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.355672 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.380601 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.391438 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config" (OuterVolumeSpecName: "config") pod "b0e696ba-be42-4d7e-8208-d5ff25c7b61c" (UID: "b0e696ba-be42-4d7e-8208-d5ff25c7b61c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.406032 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.456438 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.456509 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0e696ba-be42-4d7e-8208-d5ff25c7b61c-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557459 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle\") pod \"334ed004-9824-4be5-bf0c-027315c0bc82\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557538 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data\") pod \"334ed004-9824-4be5-bf0c-027315c0bc82\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557685 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data\") pod \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557724 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts\") pod \"334ed004-9824-4be5-bf0c-027315c0bc82\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557800 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tg2f\" (UniqueName: \"kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f\") pod \"334ed004-9824-4be5-bf0c-027315c0bc82\" (UID: \"334ed004-9824-4be5-bf0c-027315c0bc82\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557845 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle\") pod \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557905 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8nkh\" (UniqueName: \"kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh\") pod \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\" (UID: \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.557970 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts\") pod \"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\" (UID: 
\"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c\") " Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.561976 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts" (OuterVolumeSpecName: "scripts") pod "334ed004-9824-4be5-bf0c-027315c0bc82" (UID: "334ed004-9824-4be5-bf0c-027315c0bc82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.563028 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts" (OuterVolumeSpecName: "scripts") pod "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" (UID: "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.563608 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f" (OuterVolumeSpecName: "kube-api-access-5tg2f") pod "334ed004-9824-4be5-bf0c-027315c0bc82" (UID: "334ed004-9824-4be5-bf0c-027315c0bc82"). InnerVolumeSpecName "kube-api-access-5tg2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.564562 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh" (OuterVolumeSpecName: "kube-api-access-b8nkh") pod "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" (UID: "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c"). InnerVolumeSpecName "kube-api-access-b8nkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.598808 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" (UID: "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.604361 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data" (OuterVolumeSpecName: "config-data") pod "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" (UID: "d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.609738 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "334ed004-9824-4be5-bf0c-027315c0bc82" (UID: "334ed004-9824-4be5-bf0c-027315c0bc82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.617734 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data" (OuterVolumeSpecName: "config-data") pod "334ed004-9824-4be5-bf0c-027315c0bc82" (UID: "334ed004-9824-4be5-bf0c-027315c0bc82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660015 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660055 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660067 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tg2f\" (UniqueName: \"kubernetes.io/projected/334ed004-9824-4be5-bf0c-027315c0bc82-kube-api-access-5tg2f\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660079 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660089 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8nkh\" (UniqueName: \"kubernetes.io/projected/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-kube-api-access-b8nkh\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660097 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660107 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.660116 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334ed004-9824-4be5-bf0c-027315c0bc82-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.914734 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vg5kt" event={"ID":"334ed004-9824-4be5-bf0c-027315c0bc82","Type":"ContainerDied","Data":"7203fa96ff1ae0c56007d6c9a88f923492fc7a43c339e69dc917e7b136ce0772"} Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.914774 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7203fa96ff1ae0c56007d6c9a88f923492fc7a43c339e69dc917e7b136ce0772" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.914836 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vg5kt" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.921506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" event={"ID":"b0e696ba-be42-4d7e-8208-d5ff25c7b61c","Type":"ContainerDied","Data":"590a1d68e37b70764ad99348cd066670a58cf392cf76038d3b197042c2734fd9"} Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.921522 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c47bb5d77-6bb74" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.924454 4780 scope.go:117] "RemoveContainer" containerID="084913df48747b008d8c649ad400b9a86279655d6e345198f5534ceb7c325bcd" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.925882 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n48tz" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.926243 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n48tz" event={"ID":"d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c","Type":"ContainerDied","Data":"882b2f70a74bdfda925c9f1353e2fd64bc8a3fcaaeea6b08e73fdb0b466f9eac"} Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.926274 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="882b2f70a74bdfda925c9f1353e2fd64bc8a3fcaaeea6b08e73fdb0b466f9eac" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.949919 4780 scope.go:117] "RemoveContainer" containerID="6fa93223af11d911ba4d498a61be2d6f0fac14b0ddb7b2edb56c6124c4413c45" Sep 29 19:03:39 crc kubenswrapper[4780]: I0929 19:03:39.997676 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.026239 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c47bb5d77-6bb74"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041122 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: E0929 19:03:40.041615 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334ed004-9824-4be5-bf0c-027315c0bc82" containerName="nova-manage" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041633 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="334ed004-9824-4be5-bf0c-027315c0bc82" containerName="nova-manage" Sep 29 19:03:40 crc kubenswrapper[4780]: E0929 19:03:40.041661 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" containerName="nova-cell1-conductor-db-sync" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041668 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" containerName="nova-cell1-conductor-db-sync" Sep 29 19:03:40 crc kubenswrapper[4780]: E0929 19:03:40.041689 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="dnsmasq-dns" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041697 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="dnsmasq-dns" Sep 29 19:03:40 crc kubenswrapper[4780]: E0929 19:03:40.041716 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="init" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041722 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="init" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041907 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" containerName="nova-cell1-conductor-db-sync" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041944 4780 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="334ed004-9824-4be5-bf0c-027315c0bc82" containerName="nova-manage" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.041958 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" containerName="dnsmasq-dns" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.042784 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.045297 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.059854 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.072555 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghlcg\" (UniqueName: \"kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.080874 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.081171 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.103663 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.103899 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-log" containerID="cri-o://602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f" gracePeriod=30 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.104383 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-api" containerID="cri-o://2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537" gracePeriod=30 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.136162 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.174276 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.174530 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-log" containerID="cri-o://5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" gracePeriod=30 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.175526 4780 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-metadata" containerID="cri-o://dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" gracePeriod=30 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.184441 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghlcg\" (UniqueName: \"kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.184596 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.184642 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.189167 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.189255 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.191953 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.195904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.201825 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghlcg\" (UniqueName: \"kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg\") pod \"nova-cell1-conductor-0\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.371818 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.763288 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e696ba-be42-4d7e-8208-d5ff25c7b61c" path="/var/lib/kubelet/pods/b0e696ba-be42-4d7e-8208-d5ff25c7b61c/volumes" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.826519 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.900649 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903119 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs\") pod \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903182 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle\") pod \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903289 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw78w\" (UniqueName: \"kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w\") pod \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903388 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs\") pod \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903663 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data\") pod \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\" (UID: \"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c\") " Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.903842 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs" (OuterVolumeSpecName: "logs") pod "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" (UID: "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.904486 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:40 crc kubenswrapper[4780]: W0929 19:03:40.912712 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc401926_3969_448c_9910_22572fecb168.slice/crio-c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502 WatchSource:0}: Error finding container c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502: Status 404 returned error can't find the container with id c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.921429 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w" (OuterVolumeSpecName: "kube-api-access-tw78w") pod "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" (UID: "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c"). 
InnerVolumeSpecName "kube-api-access-tw78w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.946380 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bc401926-3969-448c-9910-22572fecb168","Type":"ContainerStarted","Data":"c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502"} Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.949070 4780 generic.go:334] "Generic (PLEG): container finished" podID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerID="602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f" exitCode=143 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.949117 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerDied","Data":"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f"} Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.955374 4780 generic.go:334] "Generic (PLEG): container finished" podID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerID="dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" exitCode=0 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.955399 4780 generic.go:334] "Generic (PLEG): container finished" podID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerID="5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" exitCode=143 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.955562 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerName="nova-scheduler-scheduler" containerID="cri-o://d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" gracePeriod=30 Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.955705 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.956034 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerDied","Data":"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61"} Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.956077 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerDied","Data":"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73"} Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.956089 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a4a3ae30-b7ed-48d8-a04c-bdaa3822834c","Type":"ContainerDied","Data":"22fe3201188dc3b2bf4483a40e031dab48e3435c83086accadfce016917f12d5"} Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.956103 4780 scope.go:117] "RemoveContainer" containerID="dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.973465 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" (UID: "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.992712 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data" (OuterVolumeSpecName: "config-data") pod "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" (UID: "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:40 crc kubenswrapper[4780]: I0929 19:03:40.999300 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" (UID: "a4a3ae30-b7ed-48d8-a04c-bdaa3822834c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.006241 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.006284 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.006296 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw78w\" (UniqueName: \"kubernetes.io/projected/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-kube-api-access-tw78w\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.006305 4780 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.049289 4780 scope.go:117] "RemoveContainer" containerID="5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.090245 4780 scope.go:117] "RemoveContainer" containerID="dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" Sep 29 19:03:41 crc kubenswrapper[4780]: E0929 19:03:41.090755 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61\": container with ID starting with dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61 not found: ID does not exist" containerID="dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.090810 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61"} err="failed to get container status \"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61\": rpc error: code = NotFound desc = could not find container \"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61\": container with ID starting with dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61 not found: ID does not exist" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.090846 4780 scope.go:117] 
"RemoveContainer" containerID="5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" Sep 29 19:03:41 crc kubenswrapper[4780]: E0929 19:03:41.091281 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73\": container with ID starting with 5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73 not found: ID does not exist" containerID="5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.091319 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73"} err="failed to get container status \"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73\": rpc error: code = NotFound desc = could not find container \"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73\": container with ID starting with 5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73 not found: ID does not exist" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.091341 4780 scope.go:117] "RemoveContainer" containerID="dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.091555 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61"} err="failed to get container status \"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61\": rpc error: code = NotFound desc = could not find container \"dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61\": container with ID starting with dc2fb88ebaf5e68d830c81e43fe2cb2d36d87330d9a387114074e25f893a0a61 not found: ID does not exist" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.091576 4780 scope.go:117] "RemoveContainer" containerID="5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.092114 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73"} err="failed to get container status \"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73\": rpc error: code = NotFound desc = could not find container \"5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73\": container with ID starting with 5aa568ab5c8f5d563866415ec97495f6416f221a37bedead72f61ee206cb6a73 not found: ID does not exist" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.328133 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.339273 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.351404 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:41 crc kubenswrapper[4780]: E0929 19:03:41.351861 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-log" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.351883 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-log" Sep 
29 19:03:41 crc kubenswrapper[4780]: E0929 19:03:41.351925 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-metadata" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.351934 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-metadata" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.352189 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-log" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.352209 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" containerName="nova-metadata-metadata" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.353341 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.357194 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.361151 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.366345 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.412465 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.412530 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.412619 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.412653 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jxv\" (UniqueName: \"kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.412690 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: E0929 19:03:41.436309 4780 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a3ae30_b7ed_48d8_a04c_bdaa3822834c.slice\": RecentStats: unable to find data in memory cache]" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.514346 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.514909 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jxv\" (UniqueName: \"kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.514955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.515074 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.515104 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.515544 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.519815 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.524893 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.524910 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.537783 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-27jxv\" (UniqueName: \"kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv\") pod \"nova-metadata-0\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") " pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.688998 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.967987 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bc401926-3969-448c-9910-22572fecb168","Type":"ContainerStarted","Data":"1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61"} Sep 29 19:03:41 crc kubenswrapper[4780]: I0929 19:03:41.968750 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:41.999655 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.999631823 podStartE2EDuration="2.999631823s" podCreationTimestamp="2025-09-29 19:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:41.987769108 +0000 UTC m=+1221.936067152" watchObservedRunningTime="2025-09-29 19:03:41.999631823 +0000 UTC m=+1221.947929867" Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:42.214430 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:42.788695 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a3ae30-b7ed-48d8-a04c-bdaa3822834c" path="/var/lib/kubelet/pods/a4a3ae30-b7ed-48d8-a04c-bdaa3822834c/volumes" Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:42.988227 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerStarted","Data":"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d"} Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:42.988697 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerStarted","Data":"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e"} Sep 29 19:03:42 crc kubenswrapper[4780]: I0929 19:03:42.988713 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerStarted","Data":"939601ac92c249e98e467279ef01778165cedbe0b5b64213ff6e7c308789ba98"} Sep 29 19:03:43 crc kubenswrapper[4780]: I0929 19:03:43.020206 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.020182205 podStartE2EDuration="2.020182205s" podCreationTimestamp="2025-09-29 19:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:43.009633256 +0000 UTC m=+1222.957931400" watchObservedRunningTime="2025-09-29 19:03:43.020182205 +0000 UTC m=+1222.968480249" Sep 29 19:03:43 crc kubenswrapper[4780]: E0929 19:03:43.397613 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:03:43 crc kubenswrapper[4780]: E0929 19:03:43.399808 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:03:43 crc kubenswrapper[4780]: E0929 19:03:43.401802 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:03:43 crc kubenswrapper[4780]: E0929 19:03:43.401877 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerName="nova-scheduler-scheduler" Sep 29 19:03:43 crc kubenswrapper[4780]: I0929 19:03:43.975720 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:43 crc kubenswrapper[4780]: I0929 19:03:43.976008 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerName="kube-state-metrics" containerID="cri-o://592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a" gracePeriod=30 Sep 29 19:03:44 crc kubenswrapper[4780]: I0929 19:03:44.488162 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:03:44 crc kubenswrapper[4780]: I0929 19:03:44.583437 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k5gl\" (UniqueName: \"kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl\") pod \"ad923606-d2ba-467f-b983-c8e77c9f6cc3\" (UID: \"ad923606-d2ba-467f-b983-c8e77c9f6cc3\") " Sep 29 19:03:44 crc kubenswrapper[4780]: I0929 19:03:44.589813 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl" (OuterVolumeSpecName: "kube-api-access-8k5gl") pod "ad923606-d2ba-467f-b983-c8e77c9f6cc3" (UID: "ad923606-d2ba-467f-b983-c8e77c9f6cc3"). InnerVolumeSpecName "kube-api-access-8k5gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:44 crc kubenswrapper[4780]: I0929 19:03:44.687339 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k5gl\" (UniqueName: \"kubernetes.io/projected/ad923606-d2ba-467f-b983-c8e77c9f6cc3-kube-api-access-8k5gl\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.016110 4780 generic.go:334] "Generic (PLEG): container finished" podID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerID="592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a" exitCode=2 Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.016203 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.016206 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad923606-d2ba-467f-b983-c8e77c9f6cc3","Type":"ContainerDied","Data":"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a"} Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.016959 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad923606-d2ba-467f-b983-c8e77c9f6cc3","Type":"ContainerDied","Data":"d2a560e1a3f6c105554d98ae9125dbd34074025eee08355f7f8826c66eb4b4a0"} Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.016992 4780 scope.go:117] "RemoveContainer" containerID="592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.024575 4780 generic.go:334] "Generic (PLEG): container finished" podID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerID="d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" exitCode=0 Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.024634 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b","Type":"ContainerDied","Data":"d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3"} Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.054197 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.067774 4780 scope.go:117] "RemoveContainer" containerID="592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a" Sep 29 19:03:45 crc kubenswrapper[4780]: E0929 19:03:45.068226 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a\": container with ID starting with 592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a not found: ID does not exist" containerID="592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.068262 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a"} err="failed to get container status \"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a\": rpc error: code = NotFound desc = could not find container \"592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a\": container with ID starting with 592ec6e34ff482c342439f7f1f1f8d4a89fca674aa91dfa28a292b556e35dd5a not found: ID does not exist" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.076090 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.085445 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:45 crc kubenswrapper[4780]: E0929 19:03:45.085883 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerName="kube-state-metrics" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.085902 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerName="kube-state-metrics" Sep 29 19:03:45 crc kubenswrapper[4780]: 
I0929 19:03:45.086133 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerName="kube-state-metrics" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.086802 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.089727 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.089822 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.108564 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.208840 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.209197 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.209385 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d524t\" (UniqueName: \"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.209635 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.232778 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.310970 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvw26\" (UniqueName: \"kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26\") pod \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311160 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle\") pod \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311339 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data\") pod \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\" (UID: \"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b\") " Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311709 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311770 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311841 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d524t\" (UniqueName: \"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.311886 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.316994 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.317379 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26" (OuterVolumeSpecName: "kube-api-access-gvw26") pod "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" (UID: "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b"). InnerVolumeSpecName "kube-api-access-gvw26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.322723 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.328419 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.329229 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d524t\" (UniqueName: \"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t\") pod \"kube-state-metrics-0\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.342343 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data" (OuterVolumeSpecName: "config-data") pod "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" (UID: "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.344022 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" (UID: "7bef5c9a-b96b-4cf3-b959-c8dd9de7227b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.406355 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.414574 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.414615 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvw26\" (UniqueName: \"kubernetes.io/projected/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-kube-api-access-gvw26\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.414631 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:45 crc kubenswrapper[4780]: W0929 19:03:45.889838 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded88e38f_cb35_4072_8f9f_1c6ab980ec03.slice/crio-a5efc05caf3f3f05d91017d190edd659da437fd2cdab923cd83902f41b32ce6d WatchSource:0}: Error finding container a5efc05caf3f3f05d91017d190edd659da437fd2cdab923cd83902f41b32ce6d: Status 404 returned error can't find the container with id a5efc05caf3f3f05d91017d190edd659da437fd2cdab923cd83902f41b32ce6d Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.890269 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:03:45 crc kubenswrapper[4780]: I0929 19:03:45.936519 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.028421 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.029172 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-central-agent" containerID="cri-o://cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be" gracePeriod=30 Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.029656 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="proxy-httpd" containerID="cri-o://879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2" gracePeriod=30 Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.029731 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="sg-core" containerID="cri-o://abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016" gracePeriod=30 Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.029765 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-notification-agent" containerID="cri-o://a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae" gracePeriod=30 Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.031965 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs\") pod 
\"ff590f2e-ec78-4136-a6db-c673f4f116e9\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.032108 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87b7l\" (UniqueName: \"kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l\") pod \"ff590f2e-ec78-4136-a6db-c673f4f116e9\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.032342 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle\") pod \"ff590f2e-ec78-4136-a6db-c673f4f116e9\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.032501 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data\") pod \"ff590f2e-ec78-4136-a6db-c673f4f116e9\" (UID: \"ff590f2e-ec78-4136-a6db-c673f4f116e9\") " Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.032927 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs" (OuterVolumeSpecName: "logs") pod "ff590f2e-ec78-4136-a6db-c673f4f116e9" (UID: "ff590f2e-ec78-4136-a6db-c673f4f116e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.036329 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff590f2e-ec78-4136-a6db-c673f4f116e9-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.043623 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bef5c9a-b96b-4cf3-b959-c8dd9de7227b","Type":"ContainerDied","Data":"e7a4c41c508e8e919d5304f1f9f56c808cc5b053867b0ae118ef8f20eb22c7b2"} Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.043710 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.043777 4780 scope.go:117] "RemoveContainer" containerID="d5fa693f997be79209a59d1f63614dac3bd82606e2154ef19e8cba3ac9e956f3" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.044815 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l" (OuterVolumeSpecName: "kube-api-access-87b7l") pod "ff590f2e-ec78-4136-a6db-c673f4f116e9" (UID: "ff590f2e-ec78-4136-a6db-c673f4f116e9"). InnerVolumeSpecName "kube-api-access-87b7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.054677 4780 generic.go:334] "Generic (PLEG): container finished" podID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerID="2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537" exitCode=0 Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.054755 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerDied","Data":"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537"} Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.054787 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff590f2e-ec78-4136-a6db-c673f4f116e9","Type":"ContainerDied","Data":"ca964e7c157038564a6b990c967909569a52ab3c95d3a13f4da089bdf74140b6"} Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.054856 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.059364 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed88e38f-cb35-4072-8f9f-1c6ab980ec03","Type":"ContainerStarted","Data":"a5efc05caf3f3f05d91017d190edd659da437fd2cdab923cd83902f41b32ce6d"} Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.075065 4780 scope.go:117] "RemoveContainer" containerID="2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.078542 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff590f2e-ec78-4136-a6db-c673f4f116e9" (UID: "ff590f2e-ec78-4136-a6db-c673f4f116e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.084476 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.090074 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data" (OuterVolumeSpecName: "config-data") pod "ff590f2e-ec78-4136-a6db-c673f4f116e9" (UID: "ff590f2e-ec78-4136-a6db-c673f4f116e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.106811 4780 scope.go:117] "RemoveContainer" containerID="602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.106972 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.115913 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: E0929 19:03:46.116358 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-log" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116378 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-log" Sep 29 19:03:46 crc kubenswrapper[4780]: E0929 19:03:46.116419 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-api" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116427 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-api" Sep 29 19:03:46 crc kubenswrapper[4780]: E0929 19:03:46.116440 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerName="nova-scheduler-scheduler" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116446 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerName="nova-scheduler-scheduler" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116638 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" containerName="nova-scheduler-scheduler" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116656 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-log" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.116691 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" containerName="nova-api-api" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.117537 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.120654 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.124123 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.138303 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.138348 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff590f2e-ec78-4136-a6db-c673f4f116e9-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.138365 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87b7l\" (UniqueName: \"kubernetes.io/projected/ff590f2e-ec78-4136-a6db-c673f4f116e9-kube-api-access-87b7l\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.145584 4780 scope.go:117] "RemoveContainer" containerID="2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537" Sep 29 19:03:46 crc kubenswrapper[4780]: E0929 19:03:46.147513 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537\": container with ID starting with 2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537 not found: ID does not exist" containerID="2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.147554 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537"} err="failed to get container status \"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537\": rpc error: code = NotFound desc = could not find container \"2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537\": container with ID starting with 2a5b03319c7d76aa14d70dd9d729fa043829f5d4f5e2eb1700a4d8b838eb1537 not found: ID does not exist" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.147581 4780 scope.go:117] "RemoveContainer" containerID="602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f" Sep 29 19:03:46 crc kubenswrapper[4780]: E0929 19:03:46.157516 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f\": container with ID starting with 602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f not found: ID does not exist" containerID="602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.157596 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f"} err="failed to get container status \"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f\": rpc error: code = NotFound desc = could not find container \"602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f\": container 
with ID starting with 602c768bf144e7d30c4da3441691cb1351da7809208061283346547895a4bf0f not found: ID does not exist" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.240453 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjch9\" (UniqueName: \"kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.240581 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.240609 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.342861 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.342911 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.343014 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjch9\" (UniqueName: \"kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.348300 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.348836 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.363545 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjch9\" (UniqueName: \"kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9\") pod \"nova-scheduler-0\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.447634 4780 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.466159 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.476362 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.478771 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.481760 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.487865 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.548665 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.548863 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6vg\" (UniqueName: \"kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.548911 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.548967 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.609655 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.650664 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.650771 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.650858 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6vg\" (UniqueName: \"kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.650893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.651847 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.659968 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.662370 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.673537 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6vg\" (UniqueName: \"kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg\") pod \"nova-api-0\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " pod="openstack/nova-api-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.690196 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.690265 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.771988 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bef5c9a-b96b-4cf3-b959-c8dd9de7227b" path="/var/lib/kubelet/pods/7bef5c9a-b96b-4cf3-b959-c8dd9de7227b/volumes" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.774619 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" path="/var/lib/kubelet/pods/ad923606-d2ba-467f-b983-c8e77c9f6cc3/volumes" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.777841 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff590f2e-ec78-4136-a6db-c673f4f116e9" path="/var/lib/kubelet/pods/ff590f2e-ec78-4136-a6db-c673f4f116e9/volumes" Sep 29 19:03:46 crc kubenswrapper[4780]: I0929 19:03:46.824589 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.089360 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096215 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerID="879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2" exitCode=0 Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096255 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerID="abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016" exitCode=2 Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096263 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerID="cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be" exitCode=0 Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096299 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerDied","Data":"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2"} Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096404 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerDied","Data":"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016"} Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.096423 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerDied","Data":"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be"} Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.106728 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed88e38f-cb35-4072-8f9f-1c6ab980ec03","Type":"ContainerStarted","Data":"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a"} Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.106809 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.136014 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7287973989999998 podStartE2EDuration="2.135993549s" podCreationTimestamp="2025-09-29 19:03:45 +0000 UTC" firstStartedPulling="2025-09-29 19:03:45.892715587 +0000 UTC m=+1225.841013631" lastFinishedPulling="2025-09-29 19:03:46.299911747 +0000 UTC m=+1226.248209781" observedRunningTime="2025-09-29 19:03:47.128454086 +0000 UTC m=+1227.076752150" watchObservedRunningTime="2025-09-29 19:03:47.135993549 +0000 UTC m=+1227.084291593" Sep 29 19:03:47 crc kubenswrapper[4780]: W0929 19:03:47.388346 4780 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09ddeea8_91ea_464e_a810_85bcbc6c5cb2.slice/crio-582f2c1b650e41d201611f97116a133c43cf564b50fbe5e5373ac170a3528b9e WatchSource:0}: Error finding container 582f2c1b650e41d201611f97116a133c43cf564b50fbe5e5373ac170a3528b9e: Status 404 returned error can't find the container with id 582f2c1b650e41d201611f97116a133c43cf564b50fbe5e5373ac170a3528b9e Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.389876 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.646777 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.775321 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.775429 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5khsc\" (UniqueName: \"kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.775477 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.775784 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.776016 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.776067 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.776096 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data\") pod \"ce576755-26bf-411e-807f-0b8a71ce54ed\" (UID: \"ce576755-26bf-411e-807f-0b8a71ce54ed\") " Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.777713 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.777824 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.781869 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.781895 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce576755-26bf-411e-807f-0b8a71ce54ed-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.784695 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts" (OuterVolumeSpecName: "scripts") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.785688 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc" (OuterVolumeSpecName: "kube-api-access-5khsc") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "kube-api-access-5khsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.822034 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.880351 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.884269 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.884303 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.884319 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.884331 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5khsc\" (UniqueName: \"kubernetes.io/projected/ce576755-26bf-411e-807f-0b8a71ce54ed-kube-api-access-5khsc\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.923291 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data" (OuterVolumeSpecName: "config-data") pod "ce576755-26bf-411e-807f-0b8a71ce54ed" (UID: "ce576755-26bf-411e-807f-0b8a71ce54ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:03:47 crc kubenswrapper[4780]: I0929 19:03:47.986641 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce576755-26bf-411e-807f-0b8a71ce54ed-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.125914 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerID="a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae" exitCode=0 Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.125980 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerDied","Data":"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.126084 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce576755-26bf-411e-807f-0b8a71ce54ed","Type":"ContainerDied","Data":"ae16f7c976f2301ed3cc2cf2c799de9f312da796d61954c04cd8bca8908bea3a"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.126088 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.126110 4780 scope.go:117] "RemoveContainer" containerID="879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.131440 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerStarted","Data":"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.131498 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerStarted","Data":"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.131511 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerStarted","Data":"582f2c1b650e41d201611f97116a133c43cf564b50fbe5e5373ac170a3528b9e"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.136607 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b382cd38-acb9-4516-b2f6-ce8fc385752b","Type":"ContainerStarted","Data":"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.136669 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b382cd38-acb9-4516-b2f6-ce8fc385752b","Type":"ContainerStarted","Data":"0c184071c3daec77177a497f3a60c5c94aa9bfec21755d0795a884b1b1bb93d0"} Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.161997 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.161969984 podStartE2EDuration="2.161969984s" podCreationTimestamp="2025-09-29 19:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:48.15686734 +0000 UTC m=+1228.105165384" watchObservedRunningTime="2025-09-29 19:03:48.161969984 +0000 UTC m=+1228.110268038" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.164263 4780 scope.go:117] "RemoveContainer" containerID="abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.191384 4780 scope.go:117] "RemoveContainer" containerID="a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.191699 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.228241 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.246403 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.247018 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-central-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247063 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-central-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 
19:03:48.247100 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="proxy-httpd" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247111 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="proxy-httpd" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.247158 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="sg-core" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247167 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="sg-core" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.247340 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-notification-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247356 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-notification-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247616 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-central-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247640 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="sg-core" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247656 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="ceilometer-notification-agent" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.247674 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" containerName="proxy-httpd" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.250176 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.250147059 podStartE2EDuration="2.250147059s" podCreationTimestamp="2025-09-29 19:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:03:48.218515614 +0000 UTC m=+1228.166813658" watchObservedRunningTime="2025-09-29 19:03:48.250147059 +0000 UTC m=+1228.198445103" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.255938 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.268225 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.268642 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.269119 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.275719 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.295941 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296102 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296212 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296295 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296362 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296433 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296559 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.296631 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7kr\" (UniqueName: 
\"kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.307412 4780 scope.go:117] "RemoveContainer" containerID="cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.329718 4780 scope.go:117] "RemoveContainer" containerID="879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.332534 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2\": container with ID starting with 879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2 not found: ID does not exist" containerID="879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.332582 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2"} err="failed to get container status \"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2\": rpc error: code = NotFound desc = could not find container \"879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2\": container with ID starting with 879ffb62f78dcbb98f9f41816d7f9601be3c397dc6696067f95ff40062520fa2 not found: ID does not exist" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.332614 4780 scope.go:117] "RemoveContainer" containerID="abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.333071 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016\": container with ID starting with abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016 not found: ID does not exist" containerID="abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.333136 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016"} err="failed to get container status \"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016\": rpc error: code = NotFound desc = could not find container \"abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016\": container with ID starting with abbaab7f0f98ef0642a1c0f984b5cb6f0d556582ced65cf773206210b0fb6016 not found: ID does not exist" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.333160 4780 scope.go:117] "RemoveContainer" containerID="a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.333435 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae\": container with ID starting with a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae not found: ID does not exist" containerID="a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.333464 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae"} err="failed to get container status \"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae\": rpc error: code = NotFound desc = could not find container \"a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae\": container with ID starting with a86408c754d80a3e5f3c989c1608c5f89efd497879cc69ddbfe48443d66ffcae not found: ID does not exist" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.333482 4780 scope.go:117] "RemoveContainer" containerID="cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be" Sep 29 19:03:48 crc kubenswrapper[4780]: E0929 19:03:48.333776 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be\": container with ID starting with cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be not found: ID does not exist" containerID="cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.333805 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be"} err="failed to get container status \"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be\": rpc error: code = NotFound desc = could not find container \"cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be\": container with ID starting with cd323c34a529223fcb791d5c91e60330f2fb7b4b7625ad3809fb235b424166be not found: ID does not exist" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398436 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398498 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398518 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398543 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398621 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc 
kubenswrapper[4780]: I0929 19:03:48.398643 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kr\" (UniqueName: \"kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398690 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.398709 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.399559 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.399645 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.405034 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.405452 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.405561 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.405770 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.408312 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.427597 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7kr\" (UniqueName: \"kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr\") pod \"ceilometer-0\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.602550 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:03:48 crc kubenswrapper[4780]: I0929 19:03:48.777431 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce576755-26bf-411e-807f-0b8a71ce54ed" path="/var/lib/kubelet/pods/ce576755-26bf-411e-807f-0b8a71ce54ed/volumes" Sep 29 19:03:49 crc kubenswrapper[4780]: I0929 19:03:49.105809 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:03:49 crc kubenswrapper[4780]: W0929 19:03:49.110980 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f2aacb_3391_4cc9_83b2_11fe35ff1ad5.slice/crio-a8dc6fe03be656e3c4c4d2cca0a612099fa6c8d1c67b089c2c54c675ac46b403 WatchSource:0}: Error finding container a8dc6fe03be656e3c4c4d2cca0a612099fa6c8d1c67b089c2c54c675ac46b403: Status 404 returned error can't find the container with id a8dc6fe03be656e3c4c4d2cca0a612099fa6c8d1c67b089c2c54c675ac46b403 Sep 29 19:03:49 crc kubenswrapper[4780]: I0929 19:03:49.152912 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerStarted","Data":"a8dc6fe03be656e3c4c4d2cca0a612099fa6c8d1c67b089c2c54c675ac46b403"} Sep 29 19:03:49 crc kubenswrapper[4780]: I0929 19:03:49.324132 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="ad923606-d2ba-467f-b983-c8e77c9f6cc3" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:50 crc kubenswrapper[4780]: I0929 19:03:50.164616 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerStarted","Data":"d9dbbeb8ced815a83416f8e09c2e323da204a3572113df9346bead6a2c34c96e"} Sep 29 19:03:50 crc kubenswrapper[4780]: I0929 19:03:50.409546 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 29 19:03:51 crc kubenswrapper[4780]: I0929 19:03:51.178270 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerStarted","Data":"819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a"} Sep 29 19:03:51 crc kubenswrapper[4780]: I0929 19:03:51.609798 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 19:03:51 crc kubenswrapper[4780]: I0929 19:03:51.690261 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 19:03:51 crc kubenswrapper[4780]: I0929 19:03:51.690332 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 19:03:52 crc kubenswrapper[4780]: I0929 19:03:52.189719 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerStarted","Data":"3ea44cfdf9919a76fca0a878560fc51879bf80e9fba9c6a98a763678bf48b29d"} Sep 29 19:03:52 crc kubenswrapper[4780]: I0929 19:03:52.704200 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:52 crc kubenswrapper[4780]: I0929 19:03:52.704264 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:54 crc kubenswrapper[4780]: I0929 19:03:54.211075 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerStarted","Data":"80f8694640d989cc46e19c687d484487a7410d2594910ad7ee4d5d2fecd42757"} Sep 29 19:03:54 crc kubenswrapper[4780]: I0929 19:03:54.213341 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 29 19:03:54 crc kubenswrapper[4780]: I0929 19:03:54.251028 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.153708913 podStartE2EDuration="6.251003593s" podCreationTimestamp="2025-09-29 19:03:48 +0000 UTC" firstStartedPulling="2025-09-29 19:03:49.113593656 +0000 UTC m=+1229.061891700" lastFinishedPulling="2025-09-29 19:03:53.210888326 +0000 UTC m=+1233.159186380" observedRunningTime="2025-09-29 19:03:54.247927943 +0000 UTC m=+1234.196225977" watchObservedRunningTime="2025-09-29 19:03:54.251003593 +0000 UTC m=+1234.199301637" Sep 29 19:03:55 crc kubenswrapper[4780]: I0929 19:03:55.422181 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 29 19:03:56 crc kubenswrapper[4780]: I0929 19:03:56.610123 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 19:03:56 crc kubenswrapper[4780]: I0929 19:03:56.641730 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 19:03:56 crc kubenswrapper[4780]: I0929 19:03:56.824838 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:03:56 crc kubenswrapper[4780]: I0929 19:03:56.824904 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:03:57 crc kubenswrapper[4780]: I0929 19:03:57.275482 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 19:03:57 crc kubenswrapper[4780]: I0929 19:03:57.906287 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 19:03:57 crc kubenswrapper[4780]: I0929 19:03:57.906370 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 29 19:04:01 crc kubenswrapper[4780]: I0929 19:04:01.696074 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 19:04:01 crc kubenswrapper[4780]: I0929 19:04:01.696788 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 19:04:01 crc kubenswrapper[4780]: I0929 19:04:01.701695 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 19:04:01 crc kubenswrapper[4780]: I0929 19:04:01.702771 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 19:04:03 crc kubenswrapper[4780]: I0929 19:04:03.224117 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:04:03 crc kubenswrapper[4780]: I0929 19:04:03.224494 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.215998 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.315392 4780 generic.go:334] "Generic (PLEG): container finished" podID="b06bfdaa-ca44-4904-8f50-09196dd1b882" containerID="6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8" exitCode=137 Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.315443 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b06bfdaa-ca44-4904-8f50-09196dd1b882","Type":"ContainerDied","Data":"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8"} Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.315472 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b06bfdaa-ca44-4904-8f50-09196dd1b882","Type":"ContainerDied","Data":"9fa09688fe94f02b3bb1c1368a264ea6ebc8c739a77d1317231ce81db61fb07a"} Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.315477 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.315490 4780 scope.go:117] "RemoveContainer" containerID="6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.338837 4780 scope.go:117] "RemoveContainer" containerID="6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8" Sep 29 19:04:04 crc kubenswrapper[4780]: E0929 19:04:04.339559 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8\": container with ID starting with 6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8 not found: ID does not exist" containerID="6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.339635 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8"} err="failed to get container status \"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8\": rpc error: code = NotFound desc = could not find container \"6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8\": container with ID starting with 6430b4c7a72b24228e29c09766189e52cabcea4a0cd8c5d76396d06ac665c0f8 not found: ID does not exist" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.364151 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle\") pod \"b06bfdaa-ca44-4904-8f50-09196dd1b882\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.364259 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hl56\" (UniqueName: \"kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56\") pod \"b06bfdaa-ca44-4904-8f50-09196dd1b882\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.364431 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data\") pod \"b06bfdaa-ca44-4904-8f50-09196dd1b882\" (UID: \"b06bfdaa-ca44-4904-8f50-09196dd1b882\") " Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.373175 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56" (OuterVolumeSpecName: "kube-api-access-4hl56") pod "b06bfdaa-ca44-4904-8f50-09196dd1b882" (UID: "b06bfdaa-ca44-4904-8f50-09196dd1b882"). InnerVolumeSpecName "kube-api-access-4hl56". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.399803 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b06bfdaa-ca44-4904-8f50-09196dd1b882" (UID: "b06bfdaa-ca44-4904-8f50-09196dd1b882"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.402799 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data" (OuterVolumeSpecName: "config-data") pod "b06bfdaa-ca44-4904-8f50-09196dd1b882" (UID: "b06bfdaa-ca44-4904-8f50-09196dd1b882"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.466686 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.466738 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hl56\" (UniqueName: \"kubernetes.io/projected/b06bfdaa-ca44-4904-8f50-09196dd1b882-kube-api-access-4hl56\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.466756 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06bfdaa-ca44-4904-8f50-09196dd1b882-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.657597 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.679217 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.695360 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:04:04 crc kubenswrapper[4780]: E0929 19:04:04.696012 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06bfdaa-ca44-4904-8f50-09196dd1b882" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.696043 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06bfdaa-ca44-4904-8f50-09196dd1b882" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.696347 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06bfdaa-ca44-4904-8f50-09196dd1b882" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.697301 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.700917 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.701608 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.702705 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.710785 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.764988 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06bfdaa-ca44-4904-8f50-09196dd1b882" path="/var/lib/kubelet/pods/b06bfdaa-ca44-4904-8f50-09196dd1b882/volumes" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.775951 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.776158 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.776351 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnn2\" (UniqueName: \"kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.776559 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.776603 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.879546 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnn2\" (UniqueName: \"kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.879642 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.879664 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.879718 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.879799 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.889372 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.889425 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.889444 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.889372 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:04 crc kubenswrapper[4780]: I0929 19:04:04.902547 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnn2\" (UniqueName: \"kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2\") pod \"nova-cell1-novncproxy-0\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:05 crc kubenswrapper[4780]: I0929 19:04:05.025704 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.045586 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:04:06 crc kubenswrapper[4780]: W0929 19:04:06.055487 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1f2aaf8_27dc_428c_a387_d63424889230.slice/crio-d5fb7650f7269fd4333da7d14849b4fa6a9bb6ef66af8f767aca3727b13d9a68 WatchSource:0}: Error finding container d5fb7650f7269fd4333da7d14849b4fa6a9bb6ef66af8f767aca3727b13d9a68: Status 404 returned error can't find the container with id d5fb7650f7269fd4333da7d14849b4fa6a9bb6ef66af8f767aca3727b13d9a68 Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.339900 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a1f2aaf8-27dc-428c-a387-d63424889230","Type":"ContainerStarted","Data":"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383"} Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.340253 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a1f2aaf8-27dc-428c-a387-d63424889230","Type":"ContainerStarted","Data":"d5fb7650f7269fd4333da7d14849b4fa6a9bb6ef66af8f767aca3727b13d9a68"} Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.370326 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.370304494 podStartE2EDuration="2.370304494s" podCreationTimestamp="2025-09-29 19:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:06.361626099 +0000 UTC m=+1246.309924143" watchObservedRunningTime="2025-09-29 19:04:06.370304494 +0000 UTC m=+1246.318602538" Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.833049 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.833574 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.834732 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 19:04:06 crc kubenswrapper[4780]: I0929 19:04:06.846726 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.350010 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.354213 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.591008 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.592771 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.611222 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.761974 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.762077 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.762185 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6kq\" (UniqueName: \"kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.762257 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.762280 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.762350 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864711 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864796 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6kq\" (UniqueName: \"kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864876 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.864893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.865738 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.865780 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.866143 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.866173 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.867057 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.904802 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6kq\" (UniqueName: 
\"kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq\") pod \"dnsmasq-dns-cc449b9dc-jhslf\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:07 crc kubenswrapper[4780]: I0929 19:04:07.928485 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:08 crc kubenswrapper[4780]: I0929 19:04:08.456641 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.370346 4780 generic.go:334] "Generic (PLEG): container finished" podID="7373591d-cf39-4674-8b37-449096f6a3b6" containerID="190508865a59a0e7a42d9038f11a9b7f87924fddbc543951d72a888ccb98fb52" exitCode=0 Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.370409 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" event={"ID":"7373591d-cf39-4674-8b37-449096f6a3b6","Type":"ContainerDied","Data":"190508865a59a0e7a42d9038f11a9b7f87924fddbc543951d72a888ccb98fb52"} Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.370780 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" event={"ID":"7373591d-cf39-4674-8b37-449096f6a3b6","Type":"ContainerStarted","Data":"a83ca5c714827d12697ce7c6aaefa49e88866902ac8d9a6be22ad403eab3e532"} Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.788238 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.788953 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-central-agent" containerID="cri-o://d9dbbeb8ced815a83416f8e09c2e323da204a3572113df9346bead6a2c34c96e" gracePeriod=30 Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.789121 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="proxy-httpd" containerID="cri-o://80f8694640d989cc46e19c687d484487a7410d2594910ad7ee4d5d2fecd42757" gracePeriod=30 Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.789162 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="sg-core" containerID="cri-o://3ea44cfdf9919a76fca0a878560fc51879bf80e9fba9c6a98a763678bf48b29d" gracePeriod=30 Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.789193 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-notification-agent" containerID="cri-o://819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a" gracePeriod=30 Sep 29 19:04:09 crc kubenswrapper[4780]: I0929 19:04:09.813412 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": EOF" Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.027497 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.297132 4780 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.386341 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" event={"ID":"7373591d-cf39-4674-8b37-449096f6a3b6","Type":"ContainerStarted","Data":"285ff896c42d36d4f725e2aad71a176bed9a283b3fe4f54c742f19da6fd34e81"} Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.386430 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.389967 4780 generic.go:334] "Generic (PLEG): container finished" podID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerID="80f8694640d989cc46e19c687d484487a7410d2594910ad7ee4d5d2fecd42757" exitCode=0 Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390011 4780 generic.go:334] "Generic (PLEG): container finished" podID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerID="3ea44cfdf9919a76fca0a878560fc51879bf80e9fba9c6a98a763678bf48b29d" exitCode=2 Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390025 4780 generic.go:334] "Generic (PLEG): container finished" podID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerID="d9dbbeb8ced815a83416f8e09c2e323da204a3572113df9346bead6a2c34c96e" exitCode=0 Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390126 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerDied","Data":"80f8694640d989cc46e19c687d484487a7410d2594910ad7ee4d5d2fecd42757"} Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390158 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerDied","Data":"3ea44cfdf9919a76fca0a878560fc51879bf80e9fba9c6a98a763678bf48b29d"} Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390172 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerDied","Data":"d9dbbeb8ced815a83416f8e09c2e323da204a3572113df9346bead6a2c34c96e"} Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390254 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-log" containerID="cri-o://9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9" gracePeriod=30 Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.390277 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-api" containerID="cri-o://760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5" gracePeriod=30 Sep 29 19:04:10 crc kubenswrapper[4780]: I0929 19:04:10.413880 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" podStartSLOduration=3.413861355 podStartE2EDuration="3.413861355s" podCreationTimestamp="2025-09-29 19:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:10.403353547 +0000 UTC m=+1250.351651591" watchObservedRunningTime="2025-09-29 19:04:10.413861355 +0000 UTC m=+1250.362159389" Sep 29 19:04:11 crc kubenswrapper[4780]: I0929 19:04:11.401878 4780 generic.go:334] "Generic (PLEG): 
container finished" podID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerID="9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9" exitCode=143 Sep 29 19:04:11 crc kubenswrapper[4780]: I0929 19:04:11.402958 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerDied","Data":"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"} Sep 29 19:04:12 crc kubenswrapper[4780]: E0929 19:04:12.358323 4780 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f2aacb_3391_4cc9_83b2_11fe35ff1ad5.slice/crio-819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f2aacb_3391_4cc9_83b2_11fe35ff1ad5.slice/crio-conmon-819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a.scope\": RecentStats: unable to find data in memory cache]" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.431168 4780 generic.go:334] "Generic (PLEG): container finished" podID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerID="819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a" exitCode=0 Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.431712 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerDied","Data":"819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a"} Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.587501 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.601691 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.602596 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703024 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703120 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703271 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703350 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7kr\" (UniqueName: \"kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703378 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703403 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.703454 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts\") pod \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\" (UID: \"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5\") " Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.704439 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.705122 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.711358 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts" (OuterVolumeSpecName: "scripts") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.711963 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr" (OuterVolumeSpecName: "kube-api-access-xc7kr") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "kube-api-access-xc7kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.739997 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.771245 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.809439 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.809484 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.809497 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.809508 4780 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.809521 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7kr\" (UniqueName: \"kubernetes.io/projected/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-kube-api-access-xc7kr\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.817352 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.834155 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data" (OuterVolumeSpecName: "config-data") pod "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" (UID: "23f2aacb-3391-4cc9-83b2-11fe35ff1ad5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.912699 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:12 crc kubenswrapper[4780]: I0929 19:04:12.912737 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.443666 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23f2aacb-3391-4cc9-83b2-11fe35ff1ad5","Type":"ContainerDied","Data":"a8dc6fe03be656e3c4c4d2cca0a612099fa6c8d1c67b089c2c54c675ac46b403"} Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.444091 4780 scope.go:117] "RemoveContainer" containerID="80f8694640d989cc46e19c687d484487a7410d2594910ad7ee4d5d2fecd42757" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.443756 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.469760 4780 scope.go:117] "RemoveContainer" containerID="3ea44cfdf9919a76fca0a878560fc51879bf80e9fba9c6a98a763678bf48b29d" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.497240 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.515135 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.516329 4780 scope.go:117] "RemoveContainer" containerID="819824c2b2a688db07e49736ffe21d6a74d3cf926ef2693684565633ee086e4a" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.531087 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:13 crc kubenswrapper[4780]: E0929 19:04:13.531750 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="sg-core" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.531782 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="sg-core" Sep 29 19:04:13 crc kubenswrapper[4780]: E0929 19:04:13.531792 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="proxy-httpd" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.531799 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="proxy-httpd" Sep 29 19:04:13 crc kubenswrapper[4780]: E0929 19:04:13.531816 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-notification-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.531827 4780 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-notification-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: E0929 19:04:13.531881 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-central-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.531888 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-central-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.532119 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="proxy-httpd" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.532144 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="sg-core" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.532153 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-notification-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.532177 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" containerName="ceilometer-central-agent" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.534741 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.540166 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.543884 4780 scope.go:117] "RemoveContainer" containerID="d9dbbeb8ced815a83416f8e09c2e323da204a3572113df9346bead6a2c34c96e" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.545182 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.545313 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.545212 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.630987 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631152 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631205 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631246 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28bl\" (UniqueName: \"kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631268 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631295 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631330 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.631369 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733531 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733590 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733624 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733659 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733681 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 
19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733758 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733797 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.733829 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28bl\" (UniqueName: \"kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.734541 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.739615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.743981 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.745622 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.747200 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.748245 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.748613 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.765532 
4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28bl\" (UniqueName: \"kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl\") pod \"ceilometer-0\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.875777 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:04:13 crc kubenswrapper[4780]: I0929 19:04:13.971148 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.044330 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle\") pod \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.044518 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs\") pod \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.044872 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v6vg\" (UniqueName: \"kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg\") pod \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.045382 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") pod \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") " Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.047308 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs" (OuterVolumeSpecName: "logs") pod "09ddeea8-91ea-464e-a810-85bcbc6c5cb2" (UID: "09ddeea8-91ea-464e-a810-85bcbc6c5cb2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.052448 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg" (OuterVolumeSpecName: "kube-api-access-8v6vg") pod "09ddeea8-91ea-464e-a810-85bcbc6c5cb2" (UID: "09ddeea8-91ea-464e-a810-85bcbc6c5cb2"). InnerVolumeSpecName "kube-api-access-8v6vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:04:14 crc kubenswrapper[4780]: E0929 19:04:14.079547 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data podName:09ddeea8-91ea-464e-a810-85bcbc6c5cb2 nodeName:}" failed. No retries permitted until 2025-09-29 19:04:14.579514985 +0000 UTC m=+1254.527813029 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data") pod "09ddeea8-91ea-464e-a810-85bcbc6c5cb2" (UID: "09ddeea8-91ea-464e-a810-85bcbc6c5cb2") : error deleting /var/lib/kubelet/pods/09ddeea8-91ea-464e-a810-85bcbc6c5cb2/volume-subpaths: remove /var/lib/kubelet/pods/09ddeea8-91ea-464e-a810-85bcbc6c5cb2/volume-subpaths: no such file or directory Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.083502 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ddeea8-91ea-464e-a810-85bcbc6c5cb2" (UID: "09ddeea8-91ea-464e-a810-85bcbc6c5cb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.153331 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.153669 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.153684 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v6vg\" (UniqueName: \"kubernetes.io/projected/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-kube-api-access-8v6vg\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.344509 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:04:14 crc kubenswrapper[4780]: W0929 19:04:14.353388 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42e5bce_9395_4758_8121_35408b6df2e2.slice/crio-edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0 WatchSource:0}: Error finding container edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0: Status 404 returned error can't find the container with id edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0 Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.454673 4780 generic.go:334] "Generic (PLEG): container finished" podID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerID="760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5" exitCode=0 Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.454729 4780 util.go:48] "No ready sandbox for pod can be found. 
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.454803 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerDied","Data":"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"}
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.454906 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"09ddeea8-91ea-464e-a810-85bcbc6c5cb2","Type":"ContainerDied","Data":"582f2c1b650e41d201611f97116a133c43cf564b50fbe5e5373ac170a3528b9e"}
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.454943 4780 scope.go:117] "RemoveContainer" containerID="760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.459190 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerStarted","Data":"edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0"}
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.481939 4780 scope.go:117] "RemoveContainer" containerID="9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.501595 4780 scope.go:117] "RemoveContainer" containerID="760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"
Sep 29 19:04:14 crc kubenswrapper[4780]: E0929 19:04:14.502390 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5\": container with ID starting with 760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5 not found: ID does not exist" containerID="760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.502449 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5"} err="failed to get container status \"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5\": rpc error: code = NotFound desc = could not find container \"760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5\": container with ID starting with 760aef5694aa74a31aa7d7fded752f3fbcef78ceb8968ce367a363fab2ea58e5 not found: ID does not exist"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.502485 4780 scope.go:117] "RemoveContainer" containerID="9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"
Sep 29 19:04:14 crc kubenswrapper[4780]: E0929 19:04:14.503073 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9\": container with ID starting with 9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9 not found: ID does not exist" containerID="9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.503120 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9"} err="failed to get container status \"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9\": rpc error: code = NotFound desc = could not find container \"9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9\": container with ID starting with 9b2139943fc900549e51af52be0a9dc724912b80f180d2319bd657cd479a1cc9 not found: ID does not exist"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.670633 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") pod \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\" (UID: \"09ddeea8-91ea-464e-a810-85bcbc6c5cb2\") "
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.683015 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data" (OuterVolumeSpecName: "config-data") pod "09ddeea8-91ea-464e-a810-85bcbc6c5cb2" (UID: "09ddeea8-91ea-464e-a810-85bcbc6c5cb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.769635 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f2aacb-3391-4cc9-83b2-11fe35ff1ad5" path="/var/lib/kubelet/pods/23f2aacb-3391-4cc9-83b2-11fe35ff1ad5/volumes"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.773685 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09ddeea8-91ea-464e-a810-85bcbc6c5cb2-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.795095 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.805238 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.861825 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:14 crc kubenswrapper[4780]: E0929 19:04:14.869656 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-log"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.871532 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-log"
Sep 29 19:04:14 crc kubenswrapper[4780]: E0929 19:04:14.871710 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-api"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.871785 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-api"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.875265 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-log"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.881354 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" containerName="nova-api-api"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.884737 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.884932 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.887670 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.887737 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.890022 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.982252 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.982325 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74d5n\" (UniqueName: \"kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.982352 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.982658 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.982900 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:14 crc kubenswrapper[4780]: I0929 19:04:14.983318 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.027400 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.052134 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085668 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085728 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085807 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085880 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085945 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74d5n\" (UniqueName: \"kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.085969 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.087177 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.091774 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.092004 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.097996 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.100483 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.106190 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74d5n\" (UniqueName: \"kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n\") pod \"nova-api-0\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.219349 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.477102 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerStarted","Data":"f88db872bb531d67943f47affb487b5a77c5ff64bdf19d2564052e453ae34187"}
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.499441 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.749673 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pf98d"]
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.751890 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.754801 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.755140 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.764498 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pf98d"]
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.797453 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.803612 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h99lc\" (UniqueName: \"kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.803743 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.803831 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.804295 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.907203 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.907360 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.907408 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h99lc\" (UniqueName: \"kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.907512 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.916029 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.917574 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.933826 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:15 crc kubenswrapper[4780]: I0929 19:04:15.937772 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h99lc\" (UniqueName: \"kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc\") pod \"nova-cell1-cell-mapping-pf98d\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") " pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.087644 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.500809 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerStarted","Data":"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b"}
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.501613 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerStarted","Data":"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93"}
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.501632 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerStarted","Data":"9accd7dbbde89ccb840fd3549cfd0ea46d3b1785ff85afb6e147cbe6be802864"}
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.514375 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerStarted","Data":"2206fdfda1b3679c9eaab7892ccf4c32611624a3996175a4dd0502159b261a25"}
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.544437 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.544409574 podStartE2EDuration="2.544409574s" podCreationTimestamp="2025-09-29 19:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:16.533531675 +0000 UTC m=+1256.481829719" watchObservedRunningTime="2025-09-29 19:04:16.544409574 +0000 UTC m=+1256.492707618"
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.647654 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pf98d"]
Sep 29 19:04:16 crc kubenswrapper[4780]: I0929 19:04:16.772432 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ddeea8-91ea-464e-a810-85bcbc6c5cb2" path="/var/lib/kubelet/pods/09ddeea8-91ea-464e-a810-85bcbc6c5cb2/volumes"
Sep 29 19:04:17 crc kubenswrapper[4780]: I0929 19:04:17.527660 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerStarted","Data":"f64e558c1911bea7506b2cdd5c000f9c4c3d8816f4e4b6adc9002538b83090a4"}
Sep 29 19:04:17 crc kubenswrapper[4780]: I0929 19:04:17.529525 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pf98d" event={"ID":"15fb3ff6-7863-4831-9720-a4665c09dc82","Type":"ContainerStarted","Data":"bedf0c2e64d32086726c83fc23935b9eb7e3b0cccc9c1ff45f3505778e088224"}
Sep 29 19:04:17 crc kubenswrapper[4780]: I0929 19:04:17.529575 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pf98d" event={"ID":"15fb3ff6-7863-4831-9720-a4665c09dc82","Type":"ContainerStarted","Data":"5d8fa909040a280f127ff0b1f8f896af4381f3a8e4a389ac333bfefa4391aaab"}
Sep 29 19:04:17 crc kubenswrapper[4780]: I0929 19:04:17.557964 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pf98d" podStartSLOduration=2.557938484 podStartE2EDuration="2.557938484s" podCreationTimestamp="2025-09-29 19:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:17.547076526 +0000 UTC m=+1257.495374580" watchObservedRunningTime="2025-09-29 19:04:17.557938484 +0000 UTC m=+1257.506236528"
Sep 29 19:04:17 crc kubenswrapper[4780]: I0929 19:04:17.931801 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf"
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.052905 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"]
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.053657 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="dnsmasq-dns" containerID="cri-o://768f5d5cf8168347307b5b17a365e35ef6b005b15cbb0e0cec6144bacc00023d" gracePeriod=10
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.545355 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerStarted","Data":"a13ab8e97bfc1c433e41ba1fdbdc614073a33b3747ee1e7b9e9cd3cb214ce595"}
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.547139 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.550842 4780 generic.go:334] "Generic (PLEG): container finished" podID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerID="768f5d5cf8168347307b5b17a365e35ef6b005b15cbb0e0cec6144bacc00023d" exitCode=0
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.553400 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" event={"ID":"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767","Type":"ContainerDied","Data":"768f5d5cf8168347307b5b17a365e35ef6b005b15cbb0e0cec6144bacc00023d"}
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.587022 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.850067189 podStartE2EDuration="5.587003329s" podCreationTimestamp="2025-09-29 19:04:13 +0000 UTC" firstStartedPulling="2025-09-29 19:04:14.355727133 +0000 UTC m=+1254.304025177" lastFinishedPulling="2025-09-29 19:04:18.092663283 +0000 UTC m=+1258.040961317" observedRunningTime="2025-09-29 19:04:18.574263776 +0000 UTC m=+1258.522561820" watchObservedRunningTime="2025-09-29 19:04:18.587003329 +0000 UTC m=+1258.535301363"
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.656248 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678455 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678628 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678718 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflbq\" (UniqueName: \"kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678745 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678786 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.678809 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc\") pod \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\" (UID: \"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767\") "
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.690516 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq" (OuterVolumeSpecName: "kube-api-access-rflbq") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "kube-api-access-rflbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.782453 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflbq\" (UniqueName: \"kubernetes.io/projected/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-kube-api-access-rflbq\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.792177 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.805130 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.816139 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.817275 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config" (OuterVolumeSpecName: "config") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.838642 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" (UID: "b9dbc5a2-e2dc-4ab1-9957-c1120bc77767"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.885532 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.885582 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-config\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.885593 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.885604 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:18 crc kubenswrapper[4780]: I0929 19:04:18.885613 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.564725 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" event={"ID":"b9dbc5a2-e2dc-4ab1-9957-c1120bc77767","Type":"ContainerDied","Data":"9bb5901c535c70df8891ae9c196062a02c6bad9cb36f403439a2f9cfb71f9768"}
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.564757 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.564818 4780 scope.go:117] "RemoveContainer" containerID="768f5d5cf8168347307b5b17a365e35ef6b005b15cbb0e0cec6144bacc00023d"
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.611115 4780 scope.go:117] "RemoveContainer" containerID="7417c48c704e193717034fa2316dd912006f1d6915312181a0070cefc0aec660"
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.637152 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"]
Sep 29 19:04:19 crc kubenswrapper[4780]: I0929 19:04:19.646894 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d9cc4c77f-ktp5p"]
Sep 29 19:04:20 crc kubenswrapper[4780]: I0929 19:04:20.781363 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" path="/var/lib/kubelet/pods/b9dbc5a2-e2dc-4ab1-9957-c1120bc77767/volumes"
Sep 29 19:04:23 crc kubenswrapper[4780]: I0929 19:04:23.407405 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d9cc4c77f-ktp5p" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: i/o timeout"
Sep 29 19:04:23 crc kubenswrapper[4780]: I0929 19:04:23.616928 4780 generic.go:334] "Generic (PLEG): container finished" podID="15fb3ff6-7863-4831-9720-a4665c09dc82" containerID="bedf0c2e64d32086726c83fc23935b9eb7e3b0cccc9c1ff45f3505778e088224" exitCode=0
Sep 29 19:04:23 crc kubenswrapper[4780]: I0929 19:04:23.616981 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pf98d" event={"ID":"15fb3ff6-7863-4831-9720-a4665c09dc82","Type":"ContainerDied","Data":"bedf0c2e64d32086726c83fc23935b9eb7e3b0cccc9c1ff45f3505778e088224"}
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.033571 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.142401 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle\") pod \"15fb3ff6-7863-4831-9720-a4665c09dc82\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") "
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.142986 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h99lc\" (UniqueName: \"kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc\") pod \"15fb3ff6-7863-4831-9720-a4665c09dc82\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") "
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.143900 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data\") pod \"15fb3ff6-7863-4831-9720-a4665c09dc82\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") "
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.144147 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts\") pod \"15fb3ff6-7863-4831-9720-a4665c09dc82\" (UID: \"15fb3ff6-7863-4831-9720-a4665c09dc82\") "
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.150231 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts" (OuterVolumeSpecName: "scripts") pod "15fb3ff6-7863-4831-9720-a4665c09dc82" (UID: "15fb3ff6-7863-4831-9720-a4665c09dc82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.150743 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc" (OuterVolumeSpecName: "kube-api-access-h99lc") pod "15fb3ff6-7863-4831-9720-a4665c09dc82" (UID: "15fb3ff6-7863-4831-9720-a4665c09dc82"). InnerVolumeSpecName "kube-api-access-h99lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.175991 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data" (OuterVolumeSpecName: "config-data") pod "15fb3ff6-7863-4831-9720-a4665c09dc82" (UID: "15fb3ff6-7863-4831-9720-a4665c09dc82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.178457 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15fb3ff6-7863-4831-9720-a4665c09dc82" (UID: "15fb3ff6-7863-4831-9720-a4665c09dc82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.220576 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.220630 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.247622 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-config-data\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.247650 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-scripts\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.247660 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fb3ff6-7863-4831-9720-a4665c09dc82-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.247670 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h99lc\" (UniqueName: \"kubernetes.io/projected/15fb3ff6-7863-4831-9720-a4665c09dc82-kube-api-access-h99lc\") on node \"crc\" DevicePath \"\""
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.642644 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pf98d" event={"ID":"15fb3ff6-7863-4831-9720-a4665c09dc82","Type":"ContainerDied","Data":"5d8fa909040a280f127ff0b1f8f896af4381f3a8e4a389ac333bfefa4391aaab"}
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.642709 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8fa909040a280f127ff0b1f8f896af4381f3a8e4a389ac333bfefa4391aaab"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.642712 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pf98d"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.753686 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.753902 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-log" containerID="cri-o://1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93" gracePeriod=30
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.754108 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-api" containerID="cri-o://94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b" gracePeriod=30
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.763509 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.763730 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerName="nova-scheduler-scheduler" containerID="cri-o://671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" gracePeriod=30
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.768894 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": EOF"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.768894 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": EOF"
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.779145 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.779383 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" containerID="cri-o://59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e" gracePeriod=30
Sep 29 19:04:25 crc kubenswrapper[4780]: I0929 19:04:25.779822 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" containerID="cri-o://1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d" gracePeriod=30
Sep 29 19:04:26 crc kubenswrapper[4780]: E0929 19:04:26.612483 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 29 19:04:26 crc kubenswrapper[4780]: E0929 19:04:26.613941 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 29 19:04:26 crc kubenswrapper[4780]: E0929 19:04:26.616439 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 29 19:04:26 crc kubenswrapper[4780]: E0929 19:04:26.616472 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerName="nova-scheduler-scheduler"
Sep 29 19:04:26 crc kubenswrapper[4780]: I0929 19:04:26.653910 4780 generic.go:334] "Generic (PLEG): container finished" podID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerID="59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e" exitCode=143
Sep 29 19:04:26 crc kubenswrapper[4780]: I0929 19:04:26.653993 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerDied","Data":"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e"}
Sep 29 19:04:26 crc kubenswrapper[4780]: I0929 19:04:26.657247 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce6d8c38-8675-444d-be9e-563a1016f412" containerID="1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93" exitCode=143
Sep 29 19:04:26 crc kubenswrapper[4780]: I0929 19:04:26.657281 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerDied","Data":"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93"}
Sep 29 19:04:28 crc kubenswrapper[4780]: I0929 19:04:28.926174 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:45240->10.217.0.192:8775: read: connection reset by peer"
Sep 29 19:04:28 crc kubenswrapper[4780]: I0929 19:04:28.926183 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:45250->10.217.0.192:8775: read: connection reset by peer"
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.390841 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.447400 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data\") pod \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") "
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.447507 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs\") pod \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") "
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.447660 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs\") pod \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") "
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.447802 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle\") pod \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") "
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.447858 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jxv\" (UniqueName: \"kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv\") pod \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\" (UID: \"12185efa-dd2b-4c18-ab7d-f05c1f123c30\") "
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.450542 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs" (OuterVolumeSpecName: "logs") pod "12185efa-dd2b-4c18-ab7d-f05c1f123c30" (UID: "12185efa-dd2b-4c18-ab7d-f05c1f123c30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.458833 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv" (OuterVolumeSpecName: "kube-api-access-27jxv") pod "12185efa-dd2b-4c18-ab7d-f05c1f123c30" (UID: "12185efa-dd2b-4c18-ab7d-f05c1f123c30"). InnerVolumeSpecName "kube-api-access-27jxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.487243 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data" (OuterVolumeSpecName: "config-data") pod "12185efa-dd2b-4c18-ab7d-f05c1f123c30" (UID: "12185efa-dd2b-4c18-ab7d-f05c1f123c30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.495459 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12185efa-dd2b-4c18-ab7d-f05c1f123c30" (UID: "12185efa-dd2b-4c18-ab7d-f05c1f123c30"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.523593 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "12185efa-dd2b-4c18-ab7d-f05c1f123c30" (UID: "12185efa-dd2b-4c18-ab7d-f05c1f123c30"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.551576 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.551623 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jxv\" (UniqueName: \"kubernetes.io/projected/12185efa-dd2b-4c18-ab7d-f05c1f123c30-kube-api-access-27jxv\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.551639 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.551650 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12185efa-dd2b-4c18-ab7d-f05c1f123c30-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.551663 4780 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12185efa-dd2b-4c18-ab7d-f05c1f123c30-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.695989 4780 generic.go:334] "Generic (PLEG): container finished" podID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerID="1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d" exitCode=0 Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.696108 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.696103 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerDied","Data":"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d"} Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.696320 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12185efa-dd2b-4c18-ab7d-f05c1f123c30","Type":"ContainerDied","Data":"939601ac92c249e98e467279ef01778165cedbe0b5b64213ff6e7c308789ba98"} Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.696343 4780 scope.go:117] "RemoveContainer" containerID="1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.737694 4780 scope.go:117] "RemoveContainer" containerID="59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.744732 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.762183 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.771798 4780 scope.go:117] "RemoveContainer" containerID="1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.772585 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d\": container with ID starting with 1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d not found: ID does not exist" containerID="1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.772631 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d"} err="failed to get container status \"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d\": rpc error: code = NotFound desc = could not find container \"1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d\": container with ID starting with 1839ab63c11edd8fe6830ab2dd0a5be8a16bd334b111acdb25c1450f758b614d not found: ID does not exist" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.772661 4780 scope.go:117] "RemoveContainer" containerID="59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.774199 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.774357 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e\": container with ID starting with 59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e not found: ID does not exist" containerID="59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.774388 4780 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e"} err="failed to get container status \"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e\": rpc error: code = NotFound desc = could not find container \"59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e\": container with ID starting with 59e709586a96b463a2eaabdc704e4f2b71f4d7dbd908a6729326a7b465bb957e not found: ID does not exist" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.775025 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="init" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775113 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="init" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.775186 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775233 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.775308 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="dnsmasq-dns" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775378 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="dnsmasq-dns" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.775440 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fb3ff6-7863-4831-9720-a4665c09dc82" containerName="nova-manage" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775486 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fb3ff6-7863-4831-9720-a4665c09dc82" containerName="nova-manage" Sep 29 19:04:29 crc kubenswrapper[4780]: E0929 19:04:29.775545 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775589 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775857 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-metadata" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.775956 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fb3ff6-7863-4831-9720-a4665c09dc82" containerName="nova-manage" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.776023 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9dbc5a2-e2dc-4ab1-9957-c1120bc77767" containerName="dnsmasq-dns" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.776207 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" containerName="nova-metadata-log" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.777630 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.782655 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.783397 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.786472 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.857674 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.857934 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.858093 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.858289 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.858654 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjvh\" (UniqueName: \"kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.961067 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.961134 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.961199 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc 
kubenswrapper[4780]: I0929 19:04:29.961271 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjvh\" (UniqueName: \"kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.961368 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.962613 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.966620 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.966741 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.967123 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:29 crc kubenswrapper[4780]: I0929 19:04:29.980756 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjvh\" (UniqueName: \"kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh\") pod \"nova-metadata-0\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " pod="openstack/nova-metadata-0" Sep 29 19:04:30 crc kubenswrapper[4780]: I0929 19:04:30.105129 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:04:30 crc kubenswrapper[4780]: I0929 19:04:30.599284 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:04:30 crc kubenswrapper[4780]: I0929 19:04:30.710939 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerStarted","Data":"18d14c63a50f5d576243af21f472ea4e7691a4771c90ea3fd9fda8316001056f"} Sep 29 19:04:30 crc kubenswrapper[4780]: I0929 19:04:30.769903 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12185efa-dd2b-4c18-ab7d-f05c1f123c30" path="/var/lib/kubelet/pods/12185efa-dd2b-4c18-ab7d-f05c1f123c30/volumes" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.398599 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.503680 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle\") pod \"b382cd38-acb9-4516-b2f6-ce8fc385752b\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.503765 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data\") pod \"b382cd38-acb9-4516-b2f6-ce8fc385752b\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.503807 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjch9\" (UniqueName: \"kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9\") pod \"b382cd38-acb9-4516-b2f6-ce8fc385752b\" (UID: \"b382cd38-acb9-4516-b2f6-ce8fc385752b\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.522117 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9" (OuterVolumeSpecName: "kube-api-access-fjch9") pod "b382cd38-acb9-4516-b2f6-ce8fc385752b" (UID: "b382cd38-acb9-4516-b2f6-ce8fc385752b"). InnerVolumeSpecName "kube-api-access-fjch9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.538623 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b382cd38-acb9-4516-b2f6-ce8fc385752b" (UID: "b382cd38-acb9-4516-b2f6-ce8fc385752b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.550854 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data" (OuterVolumeSpecName: "config-data") pod "b382cd38-acb9-4516-b2f6-ce8fc385752b" (UID: "b382cd38-acb9-4516-b2f6-ce8fc385752b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.588083 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.608447 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.608491 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382cd38-acb9-4516-b2f6-ce8fc385752b-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.608530 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjch9\" (UniqueName: \"kubernetes.io/projected/b382cd38-acb9-4516-b2f6-ce8fc385752b-kube-api-access-fjch9\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.710280 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74d5n\" (UniqueName: \"kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.710638 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.710749 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.710837 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.711108 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.711214 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs\") pod \"ce6d8c38-8675-444d-be9e-563a1016f412\" (UID: \"ce6d8c38-8675-444d-be9e-563a1016f412\") " Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.712108 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs" (OuterVolumeSpecName: "logs") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.716234 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n" (OuterVolumeSpecName: "kube-api-access-74d5n") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "kube-api-access-74d5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.728689 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerStarted","Data":"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.728782 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerStarted","Data":"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.732508 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce6d8c38-8675-444d-be9e-563a1016f412" containerID="94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b" exitCode=0 Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.733277 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.733661 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerDied","Data":"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.733846 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6d8c38-8675-444d-be9e-563a1016f412","Type":"ContainerDied","Data":"9accd7dbbde89ccb840fd3549cfd0ea46d3b1785ff85afb6e147cbe6be802864"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.733895 4780 scope.go:117] "RemoveContainer" containerID="94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.736555 4780 generic.go:334] "Generic (PLEG): container finished" podID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" exitCode=0 Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.736607 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b382cd38-acb9-4516-b2f6-ce8fc385752b","Type":"ContainerDied","Data":"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.736639 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b382cd38-acb9-4516-b2f6-ce8fc385752b","Type":"ContainerDied","Data":"0c184071c3daec77177a497f3a60c5c94aa9bfec21755d0795a884b1b1bb93d0"} Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.737018 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.739459 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data" (OuterVolumeSpecName: "config-data") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.745373 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.759163 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.759128592 podStartE2EDuration="2.759128592s" podCreationTimestamp="2025-09-29 19:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:31.75462991 +0000 UTC m=+1271.702927964" watchObservedRunningTime="2025-09-29 19:04:31.759128592 +0000 UTC m=+1271.707426636" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.765273 4780 scope.go:117] "RemoveContainer" containerID="1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.790343 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.797332 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.797405 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce6d8c38-8675-444d-be9e-563a1016f412" (UID: "ce6d8c38-8675-444d-be9e-563a1016f412"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.806409 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.808955 4780 scope.go:117] "RemoveContainer" containerID="94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b" Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.811561 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b\": container with ID starting with 94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b not found: ID does not exist" containerID="94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.811614 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b"} err="failed to get container status \"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b\": rpc error: code = NotFound desc = could not find container \"94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b\": container with ID starting with 94a31741cae91dcc54f587291e2f19332f155f0ee9380eea692920a343b54d8b not found: ID does not exist" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.811648 4780 scope.go:117] "RemoveContainer" containerID="1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93" Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.812064 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93\": container with ID starting with 1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93 not found: ID does not exist" containerID="1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.812107 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93"} err="failed to get container status \"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93\": rpc error: code = NotFound desc = could not find container \"1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93\": container with ID starting with 1038cca432c3a4e3098597c53eaac867d147ff02e0a5c529211f2921ac17df93 not found: ID does not exist" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.812138 4780 scope.go:117] "RemoveContainer" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813910 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74d5n\" (UniqueName: \"kubernetes.io/projected/ce6d8c38-8675-444d-be9e-563a1016f412-kube-api-access-74d5n\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813932 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813944 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813955 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6d8c38-8675-444d-be9e-563a1016f412-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813966 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.813975 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6d8c38-8675-444d-be9e-563a1016f412-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.821485 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.822187 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-api" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822212 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-api" Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.822231 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerName="nova-scheduler-scheduler" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822241 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerName="nova-scheduler-scheduler" Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.822261 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-log" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822268 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-log" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822521 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" containerName="nova-scheduler-scheduler" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822554 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-api" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.822574 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" containerName="nova-api-log" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.823508 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.825890 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.832428 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.863317 4780 scope.go:117] "RemoveContainer" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" Sep 29 19:04:31 crc kubenswrapper[4780]: E0929 19:04:31.864072 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b\": container with ID starting with 671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b not found: ID does not exist" containerID="671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.864115 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b"} err="failed to get container status \"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b\": rpc error: code = NotFound desc = could not find container \"671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b\": container with ID starting with 671ccf1b4f6528d378cf45d308563f89cb3575c0255cf2b796178bdcb968d11b not found: ID does not exist" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.915416 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.915598 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:31 crc kubenswrapper[4780]: I0929 19:04:31.915640 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfxd\" (UniqueName: \"kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.017749 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.017915 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 
19:04:32.017944 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfxd\" (UniqueName: \"kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.022387 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.022819 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.043115 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfxd\" (UniqueName: \"kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd\") pod \"nova-scheduler-0\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.142340 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.143193 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.153970 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.166807 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.168941 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.174286 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.174350 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.174661 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.186488 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222435 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222510 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222654 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222729 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxxr\" (UniqueName: \"kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.222810 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.325399 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.325979 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data\") pod \"nova-api-0\" (UID: 
\"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.326058 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.326155 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.326244 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.326383 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxxr\" (UniqueName: \"kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.327059 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.334155 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.336962 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.338953 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.342572 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.343466 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxxr\" (UniqueName: \"kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr\") pod \"nova-api-0\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " pod="openstack/nova-api-0" Sep 
29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.572736 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.618986 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:04:32 crc kubenswrapper[4780]: W0929 19:04:32.629342 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec846e3f_c11b_4818_a15b_9f855ed48a56.slice/crio-f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550 WatchSource:0}: Error finding container f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550: Status 404 returned error can't find the container with id f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550 Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.781473 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b382cd38-acb9-4516-b2f6-ce8fc385752b" path="/var/lib/kubelet/pods/b382cd38-acb9-4516-b2f6-ce8fc385752b/volumes" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.782800 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6d8c38-8675-444d-be9e-563a1016f412" path="/var/lib/kubelet/pods/ce6d8c38-8675-444d-be9e-563a1016f412/volumes" Sep 29 19:04:32 crc kubenswrapper[4780]: I0929 19:04:32.783620 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec846e3f-c11b-4818-a15b-9f855ed48a56","Type":"ContainerStarted","Data":"f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550"} Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.068102 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:04:33 crc kubenswrapper[4780]: W0929 19:04:33.091451 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02521078_2e58_4ce2_bc12_0b6c3b2ed878.slice/crio-226241802adc1b66e53d945d547c153f613a5ab0f03dba72a8f016df1608204b WatchSource:0}: Error finding container 226241802adc1b66e53d945d547c153f613a5ab0f03dba72a8f016df1608204b: Status 404 returned error can't find the container with id 226241802adc1b66e53d945d547c153f613a5ab0f03dba72a8f016df1608204b Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.223086 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.223178 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.792617 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec846e3f-c11b-4818-a15b-9f855ed48a56","Type":"ContainerStarted","Data":"06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac"} Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.795118 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerStarted","Data":"3f91861bf876fab40cbb103b723d6d21a4fb3a2aeadedbbda6de0035a6ee2aa7"} Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.795153 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerStarted","Data":"aae9731ab2bae2a8e2eb268dc27032c196b1db8b2299a7d349eab203f6ba9217"} Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.795176 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerStarted","Data":"226241802adc1b66e53d945d547c153f613a5ab0f03dba72a8f016df1608204b"} Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.813783 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.813763771 podStartE2EDuration="2.813763771s" podCreationTimestamp="2025-09-29 19:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:33.809761864 +0000 UTC m=+1273.758059908" watchObservedRunningTime="2025-09-29 19:04:33.813763771 +0000 UTC m=+1273.762061815" Sep 29 19:04:33 crc kubenswrapper[4780]: I0929 19:04:33.843110 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.843082511 podStartE2EDuration="1.843082511s" podCreationTimestamp="2025-09-29 19:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:04:33.829888494 +0000 UTC m=+1273.778186538" watchObservedRunningTime="2025-09-29 19:04:33.843082511 +0000 UTC m=+1273.791380555" Sep 29 19:04:35 crc kubenswrapper[4780]: I0929 19:04:35.105534 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:04:35 crc kubenswrapper[4780]: I0929 19:04:35.105817 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 29 19:04:37 crc kubenswrapper[4780]: I0929 19:04:37.143537 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 29 19:04:40 crc kubenswrapper[4780]: I0929 19:04:40.106102 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 19:04:40 crc kubenswrapper[4780]: I0929 19:04:40.106675 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 29 19:04:41 crc kubenswrapper[4780]: I0929 19:04:41.123597 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:04:41 crc kubenswrapper[4780]: I0929 19:04:41.123667 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:04:42 crc kubenswrapper[4780]: I0929 19:04:42.143430 4780 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 29 19:04:42 crc kubenswrapper[4780]: I0929 19:04:42.173226 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 29 19:04:42 crc kubenswrapper[4780]: I0929 19:04:42.573893 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:04:42 crc kubenswrapper[4780]: I0929 19:04:42.574522 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 29 19:04:42 crc kubenswrapper[4780]: I0929 19:04:42.977572 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 29 19:04:43 crc kubenswrapper[4780]: I0929 19:04:43.592249 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:04:43 crc kubenswrapper[4780]: I0929 19:04:43.592249 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 29 19:04:43 crc kubenswrapper[4780]: I0929 19:04:43.885998 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 29 19:04:50 crc kubenswrapper[4780]: I0929 19:04:50.113789 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 19:04:50 crc kubenswrapper[4780]: I0929 19:04:50.114895 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 29 19:04:50 crc kubenswrapper[4780]: I0929 19:04:50.120205 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 19:04:51 crc kubenswrapper[4780]: I0929 19:04:51.040121 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.582804 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.583364 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.584095 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.584757 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.596505 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 19:04:52 crc kubenswrapper[4780]: I0929 19:04:52.602996 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 29 19:05:03 crc kubenswrapper[4780]: I0929 19:05:03.223319 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:05:03 crc kubenswrapper[4780]: I0929 19:05:03.223882 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:05:03 crc kubenswrapper[4780]: I0929 19:05:03.223924 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:05:03 crc kubenswrapper[4780]: I0929 19:05:03.224700 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:05:03 crc kubenswrapper[4780]: I0929 19:05:03.224757 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a" gracePeriod=600 Sep 29 19:05:04 crc kubenswrapper[4780]: I0929 19:05:04.166503 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a" exitCode=0 Sep 29 19:05:04 crc kubenswrapper[4780]: I0929 19:05:04.166535 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a"} Sep 29 19:05:04 crc kubenswrapper[4780]: I0929 19:05:04.167017 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"} Sep 29 19:05:04 crc kubenswrapper[4780]: I0929 19:05:04.167055 4780 scope.go:117] "RemoveContainer" containerID="f026a57b468a10b5696a1d13800dd6d4186b4cd22425cdfb1197806a9210b5dc" Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.477293 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.478141 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="3608c7b9-1f29-491f-9a10-48135b074fa4" containerName="openstackclient" containerID="cri-o://f9639917e8369f971bbda3bb865e49a2021d379743b2c56bab33019529c9a847" gracePeriod=2 Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.491818 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.533443 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.670944 4780 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.671275 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="cinder-scheduler" containerID="cri-o://2bed75a72ebe1e8129dcb4991c90de2501dc80eaaa5a5d00e170c5bcd8aefd4f" gracePeriod=30 Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.671880 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="probe" containerID="cri-o://dbcf9928788092ab977ae712ae4612aa67a6f7d49fb7301d908346b1aca4b563" gracePeriod=30 Sep 29 19:05:09 crc kubenswrapper[4780]: E0929 19:05:09.675805 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 29 19:05:09 crc kubenswrapper[4780]: E0929 19:05:09.675873 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data podName:d2ee2741-9417-4698-b550-7c596d00d271 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:10.175852332 +0000 UTC m=+1310.124150376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data") pod "rabbitmq-server-0" (UID: "d2ee2741-9417-4698-b550-7c596d00d271") : configmap "rabbitmq-config-data" not found Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.761201 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.811199 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.829939 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.830201 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-8vsrs" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" containerName="openstack-network-exporter" containerID="cri-o://13a5524647bbab0fbbfe370f379b83106caeed1a42a7ccefb228fba784b1ddf7" gracePeriod=30 Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.858103 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.858416 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api-log" containerID="cri-o://aae1a7e720cb23ff6cab4d895d4d7d7fe47acc5b243d3c4f6eaa4b6fe46a9e00" gracePeriod=30 Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.858589 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" containerID="cri-o://391cc111e8cd575fa81674aac39e64f1e0c3b2f3fc46853f4758411b706b35aa" gracePeriod=30 Sep 29 19:05:09 crc kubenswrapper[4780]: I0929 19:05:09.915853 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 19:05:09 crc kubenswrapper[4780]: E0929 19:05:09.985792 4780 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:09 crc kubenswrapper[4780]: E0929 19:05:09.985884 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data podName:b90472c3-a09d-433c-922b-d164a11636e6 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:10.485863353 +0000 UTC m=+1310.434161397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b90472c3-a09d-433c-922b-d164a11636e6") : configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.023673 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.024302 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3608c7b9-1f29-491f-9a10-48135b074fa4" containerName="openstackclient" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.024459 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3608c7b9-1f29-491f-9a10-48135b074fa4" containerName="openstackclient" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.024699 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3608c7b9-1f29-491f-9a10-48135b074fa4" containerName="openstackclient" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.025588 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.038141 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.143593 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.145685 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.240260 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlt7x\" (UniqueName: \"kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x\") pod \"placementb644-account-delete-t2p8t\" (UID: \"503714fd-6dcf-4b1d-8806-dd78a3e85b7f\") " pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.240624 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.240684 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data podName:d2ee2741-9417-4698-b550-7c596d00d271 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:11.240667515 +0000 UTC m=+1311.188965559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data") pod "rabbitmq-server-0" (UID: "d2ee2741-9417-4698-b550-7c596d00d271") : configmap "rabbitmq-config-data" not found Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.323868 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vsrs_00f3e1c2-9a7e-42d1-8aa8-396285ea40c8/openstack-network-exporter/0.log" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.323915 4780 generic.go:334] "Generic (PLEG): container finished" podID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" containerID="13a5524647bbab0fbbfe370f379b83106caeed1a42a7ccefb228fba784b1ddf7" exitCode=2 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.324001 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vsrs" event={"ID":"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8","Type":"ContainerDied","Data":"13a5524647bbab0fbbfe370f379b83106caeed1a42a7ccefb228fba784b1ddf7"} Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.345102 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.348595 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czx87\" (UniqueName: \"kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87\") pod \"barbican32a4-account-delete-xckl5\" (UID: \"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4\") " pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.348671 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlt7x\" (UniqueName: \"kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x\") pod \"placementb644-account-delete-t2p8t\" (UID: \"503714fd-6dcf-4b1d-8806-dd78a3e85b7f\") " pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.379569 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerID="aae1a7e720cb23ff6cab4d895d4d7d7fe47acc5b243d3c4f6eaa4b6fe46a9e00" exitCode=143 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.379629 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerDied","Data":"aae1a7e720cb23ff6cab4d895d4d7d7fe47acc5b243d3c4f6eaa4b6fe46a9e00"} Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.409582 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.410996 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.422108 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.443111 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.455578 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlt7x\" (UniqueName: \"kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x\") pod \"placementb644-account-delete-t2p8t\" (UID: \"503714fd-6dcf-4b1d-8806-dd78a3e85b7f\") " pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.456869 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czx87\" (UniqueName: \"kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87\") pod \"barbican32a4-account-delete-xckl5\" (UID: \"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4\") " pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.471706 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.472093 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" containerID="cri-o://28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" gracePeriod=30 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.472262 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="openstack-network-exporter" containerID="cri-o://4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083" gracePeriod=30 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.493259 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xmmfb"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.507490 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czx87\" (UniqueName: \"kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87\") pod \"barbican32a4-account-delete-xckl5\" (UID: \"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4\") " pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.508104 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xmmfb"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.537940 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="rabbitmq" containerID="cri-o://a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac" gracePeriod=604800 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.554972 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.573514 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.574789 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfh44\" (UniqueName: \"kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44\") pod \"novaapi0c9e-account-delete-l4t5r\" (UID: \"622d766f-f43c-434c-9353-2315a6c82ae6\") " pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.575494 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.575568 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data podName:b90472c3-a09d-433c-922b-d164a11636e6 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:11.575544934 +0000 UTC m=+1311.523842978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b90472c3-a09d-433c-922b-d164a11636e6") : configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.627434 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="rabbitmq" containerID="cri-o://0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0" gracePeriod=604800 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.677354 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfh44\" (UniqueName: \"kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44\") pod \"novaapi0c9e-account-delete-l4t5r\" (UID: \"622d766f-f43c-434c-9353-2315a6c82ae6\") " pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.699064 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.699505 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="openstack-network-exporter" containerID="cri-o://1a72c9638b5649fd8982600fc6af41f0dfb4434ab14f6a9fa20981be04918d1c" gracePeriod=300 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.715462 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.716955 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.725496 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.769475 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfh44\" (UniqueName: \"kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44\") pod \"novaapi0c9e-account-delete-l4t5r\" (UID: \"622d766f-f43c-434c-9353-2315a6c82ae6\") " pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.833314 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.855897 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89293f3-5080-4326-9f2c-7ba9a2f34280" path="/var/lib/kubelet/pods/f89293f3-5080-4326-9f2c-7ba9a2f34280/volumes" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.866812 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f6rkc"] Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.866873 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f6rkc"] Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.883064 4780 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-hzb5x" message=< Sep 29 19:05:10 crc kubenswrapper[4780]: Exiting ovn-controller (1) [ OK ] Sep 29 19:05:10 crc kubenswrapper[4780]: > Sep 29 19:05:10 crc kubenswrapper[4780]: E0929 19:05:10.883143 4780 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-hzb5x" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" containerID="cri-o://5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.883661 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-hzb5x" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" containerID="cri-o://5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" gracePeriod=29 Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.903812 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtqv\" (UniqueName: \"kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv\") pod \"novacell063ac-account-delete-rvt6m\" (UID: \"eed2917c-127a-4dbd-b951-6b141853e47c\") " pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:10 crc kubenswrapper[4780]: I0929 19:05:10.943800 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.002253 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-d9jdv"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.031766 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtqv\" (UniqueName: \"kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv\") pod \"novacell063ac-account-delete-rvt6m\" (UID: 
\"eed2917c-127a-4dbd-b951-6b141853e47c\") " pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.062425 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="ovsdbserver-sb" containerID="cri-o://554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" gracePeriod=300 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.062578 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.063930 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.077698 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtqv\" (UniqueName: \"kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv\") pod \"novacell063ac-account-delete-rvt6m\" (UID: \"eed2917c-127a-4dbd-b951-6b141853e47c\") " pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.170437 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-d9jdv"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.179668 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.211928 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pf98d"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.242564 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdxh\" (UniqueName: \"kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh\") pod \"neutron33e2-account-delete-nr86j\" (UID: \"83f061df-a5ff-4db1-b87f-4106a5e56b55\") " pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.242769 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.242835 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data podName:d2ee2741-9417-4698-b550-7c596d00d271 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:13.242813991 +0000 UTC m=+1313.191112035 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data") pod "rabbitmq-server-0" (UID: "d2ee2741-9417-4698-b550-7c596d00d271") : configmap "rabbitmq-config-data" not found Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.263446 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pf98d"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.355407 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdxh\" (UniqueName: \"kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh\") pod \"neutron33e2-account-delete-nr86j\" (UID: \"83f061df-a5ff-4db1-b87f-4106a5e56b55\") " pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.367117 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.414311 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdxh\" (UniqueName: \"kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh\") pod \"neutron33e2-account-delete-nr86j\" (UID: \"83f061df-a5ff-4db1-b87f-4106a5e56b55\") " pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.466767 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vsrs_00f3e1c2-9a7e-42d1-8aa8-396285ea40c8/openstack-network-exporter/0.log" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.466840 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.467990 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.468082 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.468107 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhw6r\" (UniqueName: \"kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.468129 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.468225 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" 
(UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.468329 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle\") pod \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\" (UID: \"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8\") " Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.469075 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.469380 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.472105 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config" (OuterVolumeSpecName: "config") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.486341 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vg5kt"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.495466 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r" (OuterVolumeSpecName: "kube-api-access-lhw6r") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "kube-api-access-lhw6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.527662 4780 generic.go:334] "Generic (PLEG): container finished" podID="3683c554-eec7-4825-8972-0445faf15a23" containerID="4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083" exitCode=2 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.527799 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerDied","Data":"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083"} Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.553464 4780 generic.go:334] "Generic (PLEG): container finished" podID="91a8fa86-9475-490a-9c9f-09233413eab5" containerID="5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" exitCode=0 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.553617 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x" event={"ID":"91a8fa86-9475-490a-9c9f-09233413eab5","Type":"ContainerDied","Data":"5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd"} Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.590250 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_62b9c388-0f74-42fc-bf3d-711322b976d8/ovsdbserver-sb/0.log" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.590315 4780 generic.go:334] "Generic (PLEG): container finished" podID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerID="1a72c9638b5649fd8982600fc6af41f0dfb4434ab14f6a9fa20981be04918d1c" exitCode=2 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.590343 4780 generic.go:334] "Generic (PLEG): container finished" podID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerID="554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" exitCode=143 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.590482 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerDied","Data":"1a72c9638b5649fd8982600fc6af41f0dfb4434ab14f6a9fa20981be04918d1c"} Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.590526 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerDied","Data":"554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797"} Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.592610 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.592636 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhw6r\" (UniqueName: \"kubernetes.io/projected/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-kube-api-access-lhw6r\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.592651 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.592660 4780 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-ovs-rundir\") on node \"crc\" DevicePath \"\"" Sep 
29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.592728 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.598413 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data podName:b90472c3-a09d-433c-922b-d164a11636e6 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:13.592765412 +0000 UTC m=+1313.541063456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b90472c3-a09d-433c-922b-d164a11636e6") : configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.606993 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8vsrs_00f3e1c2-9a7e-42d1-8aa8-396285ea40c8/openstack-network-exporter/0.log" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.607104 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8vsrs" event={"ID":"00f3e1c2-9a7e-42d1-8aa8-396285ea40c8","Type":"ContainerDied","Data":"aa20fc20ab16ba8887da12486113184683f526c2c90d5e16275cee20954886fa"} Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.607149 4780 scope.go:117] "RemoveContainer" containerID="13a5524647bbab0fbbfe370f379b83106caeed1a42a7ccefb228fba784b1ddf7" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.607347 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.621532 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vg5kt"] Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.628016 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797 is running failed: container process not found" containerID="554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.636742 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.637393 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="openstack-network-exporter" containerID="cri-o://ca3350f3db78178478e71ece3c7e24a200961466f9224460cb27c414e7b48f42" gracePeriod=300 Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.647320 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797 is running failed: container process not found" containerID="554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.653418 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797 is running failed: container process not found" containerID="554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.653902 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="ovsdbserver-sb" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.658365 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-btjpk"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.658686 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.669870 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.702529 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.713712 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-btjpk"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.751228 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" (UID: "00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.781387 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="ovsdbserver-nb" containerID="cri-o://bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" gracePeriod=300 Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.825155 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.957665 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:11 crc kubenswrapper[4780]: E0929 19:05:11.958140 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" containerName="openstack-network-exporter" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.958160 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" containerName="openstack-network-exporter" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.958383 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" containerName="openstack-network-exporter" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.959331 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.976828 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:11 crc kubenswrapper[4780]: I0929 19:05:11.992234 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rvtnb"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.013820 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014289 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-server" containerID="cri-o://00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014737 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="swift-recon-cron" containerID="cri-o://ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014795 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="rsync" containerID="cri-o://7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014832 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-expirer" containerID="cri-o://967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5" gracePeriod=30 Sep 29 19:05:12 crc 
kubenswrapper[4780]: I0929 19:05:12.014862 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-updater" containerID="cri-o://a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014892 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-auditor" containerID="cri-o://2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014927 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-replicator" containerID="cri-o://3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014959 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-server" containerID="cri-o://ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.014996 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-updater" containerID="cri-o://f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.015028 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-auditor" containerID="cri-o://ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.016375 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-replicator" containerID="cri-o://31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.016445 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-server" containerID="cri-o://28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.016481 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-reaper" containerID="cri-o://229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.016517 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-auditor" containerID="cri-o://87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.016551 4780 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-replicator" containerID="cri-o://c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.035437 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78j7\" (UniqueName: \"kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7\") pod \"glancefda4-account-delete-4ndzp\" (UID: \"f271f9ca-bced-4144-b779-06e7422d9a63\") " pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.042318 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rvtnb"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.055122 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.056279 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="dnsmasq-dns" containerID="cri-o://285ff896c42d36d4f725e2aad71a176bed9a283b3fe4f54c742f19da6fd34e81" gracePeriod=10 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.073350 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rhnjt"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.087445 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rhnjt"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.098190 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.098841 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f568c9c76-zb5pj" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-api" containerID="cri-o://0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.098482 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f568c9c76-zb5pj" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-log" containerID="cri-o://eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.140847 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w78j7\" (UniqueName: \"kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7\") pod \"glancefda4-account-delete-4ndzp\" (UID: \"f271f9ca-bced-4144-b779-06e7422d9a63\") " pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.157390 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.193099 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78j7\" (UniqueName: \"kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7\") pod \"glancefda4-account-delete-4ndzp\" (UID: \"f271f9ca-bced-4144-b779-06e7422d9a63\") " pod="openstack/glancefda4-account-delete-4ndzp" 
Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.193272 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" containerID="cri-o://d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" gracePeriod=28 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.203258 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b644-account-create-4hb8n"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.211447 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4slf2"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.217878 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b644-account-create-4hb8n"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.224491 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4slf2"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.233541 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.265649 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.266256 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-log" containerID="cri-o://aae9731ab2bae2a8e2eb268dc27032c196b1db8b2299a7d349eab203f6ba9217" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.276769 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-api" containerID="cri-o://3f91861bf876fab40cbb103b723d6d21a4fb3a2aeadedbbda6de0035a6ee2aa7" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.348111 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.376330 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d954bbbf5-jklnq" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-api" containerID="cri-o://3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.376764 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d954bbbf5-jklnq" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-httpd" containerID="cri-o://3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92" gracePeriod=30 Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.413653 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-2dxq9"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.468213 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-48be-account-create-sjthn"] Sep 29 19:05:12 crc kubenswrapper[4780]: I0929 19:05:12.512249 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-2dxq9"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.548880 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-48be-account-create-sjthn"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.582120 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.582383 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerName="nova-scheduler-scheduler" containerID="cri-o://06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.656729 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.657430 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-log" containerID="cri-o://22c83df1dfa900462fb0bdf93010df94b1f1fbd660599f3ce6a52119f57afbe9" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.657968 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-httpd" containerID="cri-o://eeee69b0a809e51c2de8aae84184f344369f4e4f6fab7ebfa4f65f602565ed13" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.659190 4780 generic.go:334] "Generic (PLEG): container finished" podID="3608c7b9-1f29-491f-9a10-48135b074fa4" containerID="f9639917e8369f971bbda3bb865e49a2021d379743b2c56bab33019529c9a847" exitCode=137 Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.683912 4780 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Sep 29 19:05:13 crc kubenswrapper[4780]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 29 19:05:13 crc kubenswrapper[4780]: + source /usr/local/bin/container-scripts/functions Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNBridge=br-int Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNRemote=tcp:localhost:6642 Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNEncapType=geneve Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNAvailabilityZones= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ EnableChassisAsGateway=true Sep 29 19:05:13 crc kubenswrapper[4780]: ++ PhysicalNetworks= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNHostName= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 29 19:05:13 crc kubenswrapper[4780]: ++ ovs_dir=/var/lib/openvswitch Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 29 19:05:13 crc kubenswrapper[4780]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + cleanup_ovsdb_server_semaphore Sep 29 19:05:13 crc kubenswrapper[4780]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 29 19:05:13 crc kubenswrapper[4780]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tqkx6" message=< Sep 29 19:05:13 crc kubenswrapper[4780]: Exiting ovsdb-server (5) [ OK ] Sep 29 19:05:13 crc kubenswrapper[4780]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 29 19:05:13 crc kubenswrapper[4780]: + source /usr/local/bin/container-scripts/functions Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNBridge=br-int Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNRemote=tcp:localhost:6642 Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNEncapType=geneve Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNAvailabilityZones= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ EnableChassisAsGateway=true Sep 29 19:05:13 crc kubenswrapper[4780]: ++ PhysicalNetworks= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNHostName= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 29 19:05:13 crc kubenswrapper[4780]: ++ ovs_dir=/var/lib/openvswitch Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 29 19:05:13 crc kubenswrapper[4780]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + cleanup_ovsdb_server_semaphore Sep 29 19:05:13 crc kubenswrapper[4780]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 29 19:05:13 crc kubenswrapper[4780]: > Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.683962 4780 kuberuntime_container.go:691] "PreStop hook failed" err=< Sep 29 19:05:13 crc kubenswrapper[4780]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Sep 29 19:05:13 crc kubenswrapper[4780]: + source /usr/local/bin/container-scripts/functions Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNBridge=br-int Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNRemote=tcp:localhost:6642 Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNEncapType=geneve Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNAvailabilityZones= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ EnableChassisAsGateway=true Sep 29 19:05:13 crc kubenswrapper[4780]: ++ PhysicalNetworks= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ OVNHostName= Sep 29 19:05:13 crc kubenswrapper[4780]: ++ DB_FILE=/etc/openvswitch/conf.db Sep 29 19:05:13 crc kubenswrapper[4780]: ++ ovs_dir=/var/lib/openvswitch Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Sep 29 19:05:13 crc kubenswrapper[4780]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Sep 29 19:05:13 crc kubenswrapper[4780]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + sleep 0.5 Sep 29 19:05:13 crc kubenswrapper[4780]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Sep 29 19:05:13 crc kubenswrapper[4780]: + cleanup_ovsdb_server_semaphore Sep 29 19:05:13 crc kubenswrapper[4780]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Sep 29 19:05:13 crc kubenswrapper[4780]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Sep 29 19:05:13 crc kubenswrapper[4780]: > pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" containerID="cri-o://0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.684006 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" containerID="cri-o://0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" gracePeriod=28 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.717839 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ws6nx"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.722126 4780 generic.go:334] "Generic (PLEG): container finished" podID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerID="dbcf9928788092ab977ae712ae4612aa67a6f7d49fb7301d908346b1aca4b563" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.722153 4780 generic.go:334] "Generic (PLEG): container finished" podID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerID="2bed75a72ebe1e8129dcb4991c90de2501dc80eaaa5a5d00e170c5bcd8aefd4f" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.722225 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerDied","Data":"dbcf9928788092ab977ae712ae4612aa67a6f7d49fb7301d908346b1aca4b563"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.722252 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerDied","Data":"2bed75a72ebe1e8129dcb4991c90de2501dc80eaaa5a5d00e170c5bcd8aefd4f"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.724447 4780 generic.go:334] "Generic (PLEG): container finished" podID="6105150b-678d-4925-a981-9a0d75377f32" containerID="eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1" exitCode=143 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.724481 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerDied","Data":"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.726123 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8611dff0-9ad1-4bba-b687-958d7e887859/ovsdbserver-nb/0.log" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.726150 4780 generic.go:334] "Generic (PLEG): container finished" podID="8611dff0-9ad1-4bba-b687-958d7e887859" containerID="ca3350f3db78178478e71ece3c7e24a200961466f9224460cb27c414e7b48f42" exitCode=2 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.726162 4780 generic.go:334] "Generic (PLEG): container finished" podID="8611dff0-9ad1-4bba-b687-958d7e887859" containerID="bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" exitCode=143
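The "Exec lifecycle hook for Container in Pod failed" and "PreStop hook failed" entries above both quote the xtrace output of the ovsdb-server container's preStop hook, /usr/local/bin/container-scripts/stop-ovsdb-server.sh. A minimal sketch of what the traced script appears to do, reconstructed from the trace alone (the sourced functions file and the cleanup_ovsdb_server_semaphore helper ship in the container image, so their exact contents are assumptions here):

    #!/bin/bash
    # Reconstruction of stop-ovsdb-server.sh from the xtrace in the log above,
    # not the shipped source. Tracing (set -x) is implied by the '+' output.
    set -x

    # 'functions' sets OVNBridge, OVNRemote, DB_FILE, ovs_dir,
    # SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE, etc. (values visible in the trace).
    source "$(dirname "$0")/functions"

    # Poll until ovn-controller signals that it is safe to stop ovsdb-server.
    while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
        sleep 0.5
    done

    # Per the trace, this helper removes the semaphore file again.
    cleanup_ovsdb_server_semaphore

    # Stop only ovsdb-server; ovs-vswitchd is terminated separately
    # (see the earlier "Killing container ... ovs-vswitchd" entry).
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd

The hook's exit status 137 is 128+9 (SIGKILL), i.e. the exec session was killed rather than exiting on its own, even though ovs-ctl had already printed "Exiting ovsdb-server (5) [ OK ]". Kubelet records the failed hook (once from handlers.go, once from kuberuntime_container.go) and then proceeds to kill the ovsdb-server container with the remaining grace period, gracePeriod=28.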
Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.726191 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerDied","Data":"ca3350f3db78178478e71ece3c7e24a200961466f9224460cb27c414e7b48f42"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.726209 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerDied","Data":"bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.728915 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ws6nx"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.738759 4780 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.755105 4780 generic.go:334] "Generic (PLEG): container finished" podID="7373591d-cf39-4674-8b37-449096f6a3b6" containerID="285ff896c42d36d4f725e2aad71a176bed9a283b3fe4f54c742f19da6fd34e81" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803136 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803168 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803175 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803184 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803191 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803198 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803204 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803211 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.803218 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929
19:05:12.803224 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:12.929945 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: connect: connection refused" Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.935400 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce is running failed: container process not found" containerID="bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.943076 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce is running failed: container process not found" containerID="bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.943867 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce is running failed: container process not found" containerID="bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" cmd=["/usr/bin/pidof","ovsdb-server"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:12.943901 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="ovsdbserver-nb" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.221884 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fb3ff6-7863-4831-9720-a4665c09dc82" path="/var/lib/kubelet/pods/15fb3ff6-7863-4831-9720-a4665c09dc82/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.224167 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fa1e2b-3a9e-4e88-97e6-9751eb595e01" path="/var/lib/kubelet/pods/19fa1e2b-3a9e-4e88-97e6-9751eb595e01/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.225103 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc9cbc9-e264-4eba-81ff-38dbb8b126ad" path="/var/lib/kubelet/pods/1fc9cbc9-e264-4eba-81ff-38dbb8b126ad/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.227951 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2603fb51-f7e5-4212-a85a-2411175cd5d7" path="/var/lib/kubelet/pods/2603fb51-f7e5-4212-a85a-2411175cd5d7/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.230785 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334ed004-9824-4be5-bf0c-027315c0bc82" 
path="/var/lib/kubelet/pods/334ed004-9824-4be5-bf0c-027315c0bc82/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.231617 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3723c568-a926-469d-bda8-99c2a0ed7095" path="/var/lib/kubelet/pods/3723c568-a926-469d-bda8-99c2a0ed7095/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.232980 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3341c3-4401-4e61-aa4b-58943632c521" path="/var/lib/kubelet/pods/5a3341c3-4401-4e61-aa4b-58943632c521/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.234415 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7fccbf-70e9-4d7c-9915-97026e49e6b0" path="/var/lib/kubelet/pods/8f7fccbf-70e9-4d7c-9915-97026e49e6b0/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.235011 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90114ff5-5dc3-4755-be92-df3f1f7a12f0" path="/var/lib/kubelet/pods/90114ff5-5dc3-4755-be92-df3f1f7a12f0/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.235653 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d3cfe6-96f0-442a-aa5d-8a08ff10eed1" path="/var/lib/kubelet/pods/91d3cfe6-96f0-442a-aa5d-8a08ff10eed1/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.236551 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1848d52-c01b-4618-bbf7-777cc63f0544" path="/var/lib/kubelet/pods/d1848d52-c01b-4618-bbf7-777cc63f0544/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.239125 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4fd579e-b8e6-4845-b5fd-b9291fe94829" path="/var/lib/kubelet/pods/f4fd579e-b8e6-4845-b5fd-b9291fe94829/volumes" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.240002 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-32a4-account-create-dsqns"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.253268 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-32a4-account-create-dsqns"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.253707 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" event={"ID":"7373591d-cf39-4674-8b37-449096f6a3b6","Type":"ContainerDied","Data":"285ff896c42d36d4f725e2aad71a176bed9a283b3fe4f54c742f19da6fd34e81"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.253882 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.253899 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254067 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254085 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254098 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254118 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254168 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254185 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254228 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254250 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254269 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254282 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254364 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254378 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58hpf"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254388 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254398 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254412 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-58hpf"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254448 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254460 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4dm25"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254471 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a5b9-account-create-9sbx9"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254480 4780 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254493 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254507 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a5b9-account-create-9sbx9"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254534 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4dm25"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254545 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254557 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n48tz"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254566 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dc4k2"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254575 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-63ac-account-create-l5pps"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254585 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dc4k2"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254596 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-63ac-account-create-l5pps"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254609 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n48tz"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254620 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rwgns"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254630 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-33e2-account-create-zvxcq"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254640 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rwgns"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254649 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.254954 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="bc401926-3969-448c-9910-22572fecb168" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.255208 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" containerID="cri-o://a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256152 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-log" 
containerID="cri-o://02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256322 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-74988cff4c-fmczd" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker-log" containerID="cri-o://4b807f34a3c65b6d836e3bd255f8320430de3cf2180ee8e33b572ba6e6717b3b" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256471 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6764d576f6-q7trv" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" containerID="cri-o://0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256520 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-74988cff4c-fmczd" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker" containerID="cri-o://b7c50fc2d9534221112ba9758fee8b52356d9efe5f5ed3fb8c0432498719f180" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256496 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata" containerID="cri-o://7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.256913 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6764d576f6-q7trv" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api" containerID="cri-o://968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.257004 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.257084 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener-log" containerID="cri-o://4cac85a12f0ad40a5dc9410707339b2f4a75fbc7d7e9f99310b24f564e2e4f03" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.257127 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener" containerID="cri-o://a3bd0c44347129dd3eb6a433ef0ca6e0cd25372b1057012a51a9c55da4d8ff4a" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.257148 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-httpd" containerID="cri-o://724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.271140 4780 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" gracePeriod=30 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.313649 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.314885 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.314943 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data podName:d2ee2741-9417-4698-b550-7c596d00d271 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:17.314922392 +0000 UTC m=+1317.263220436 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data") pod "rabbitmq-server-0" (UID: "d2ee2741-9417-4698-b550-7c596d00d271") : configmap "rabbitmq-config-data" not found Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.348938 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-h6w2d"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.392687 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.438860 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-33e2-account-create-zvxcq"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.459362 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-h6w2d"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.469822 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="galera" containerID="cri-o://ff8a529133b59522aa5a47a19801e5fe0c76dbf90cf9186ffe730d3e74db9aba" gracePeriod=29 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.480292 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fda4-account-create-xncgz"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.508846 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fda4-account-create-xncgz"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.528695 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.608940 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd is running failed: container process not found" containerID="5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.612350 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd is running failed: container 
process not found" containerID="5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.616100 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd is running failed: container process not found" containerID="5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.616162 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-hzb5x" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.642292 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.642392 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data podName:b90472c3-a09d-433c-922b-d164a11636e6 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:17.642370004 +0000 UTC m=+1317.590668048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b90472c3-a09d-433c-922b-d164a11636e6") : configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.720799 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.735283 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.735456 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.735566 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.739418 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.739614 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.739651 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.743361 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.746190 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.743955 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.752673 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:13 crc kubenswrapper[4780]: E0929 19:05:13.753277 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.796854 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.807916 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.829846 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.848562 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.848603 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc8z8\" (UniqueName: \"kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.849125 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.849259 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.849288 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.849307 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.849358 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs\") pod \"91a8fa86-9475-490a-9c9f-09233413eab5\" (UID: \"91a8fa86-9475-490a-9c9f-09233413eab5\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.854138 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts" (OuterVolumeSpecName: "scripts") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.855142 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.855317 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run" (OuterVolumeSpecName: "var-run") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.857011 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.872526 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.857202 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.883069 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.886339 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8" (OuterVolumeSpecName: "kube-api-access-kc8z8") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "kube-api-access-kc8z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895828 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895872 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895882 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895892 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895932 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895957 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895968 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.895977 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.909297 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_62b9c388-0f74-42fc-bf3d-711322b976d8/ovsdbserver-sb/0.log" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.915802 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.919879 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.921118 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2903cdd8-3ab5-4c85-892c-2139eb0bde7c","Type":"ContainerDied","Data":"aa91dbe2bcb995257ece6a558d9d8700ad7320f1ed8d1df9fc1b87bfe9c9a0f4"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.921198 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa91dbe2bcb995257ece6a558d9d8700ad7320f1ed8d1df9fc1b87bfe9c9a0f4" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.941191 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.982357 4780 generic.go:334] "Generic (PLEG): container finished" podID="3c91af49-2adc-47a1-892c-82da3b338492" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" exitCode=0 Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.982498 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerDied","Data":"0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea"} Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.984840 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8611dff0-9ad1-4bba-b687-958d7e887859/ovsdbserver-nb/0.log" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.984986 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.987859 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988442 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988539 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle\") pod \"3608c7b9-1f29-491f-9a10-48135b074fa4\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988584 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988649 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988760 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config\") pod \"3608c7b9-1f29-491f-9a10-48135b074fa4\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988847 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988911 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.988989 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqknp\" (UniqueName: \"kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989106 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c55lh\" (UniqueName: \"kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh\") pod \"3608c7b9-1f29-491f-9a10-48135b074fa4\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989148 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret\") pod \"3608c7b9-1f29-491f-9a10-48135b074fa4\" (UID: \"3608c7b9-1f29-491f-9a10-48135b074fa4\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989243 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989291 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989387 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdxqq\" (UniqueName: \"kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989462 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989508 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts\") pod \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\" (UID: \"2903cdd8-3ab5-4c85-892c-2139eb0bde7c\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989570 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989613 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.989677 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir\") pod \"62b9c388-0f74-42fc-bf3d-711322b976d8\" (UID: \"62b9c388-0f74-42fc-bf3d-711322b976d8\") " Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.993558 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.993673 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995013 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91a8fa86-9475-490a-9c9f-09233413eab5-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995122 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc8z8\" (UniqueName: \"kubernetes.io/projected/91a8fa86-9475-490a-9c9f-09233413eab5-kube-api-access-kc8z8\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995142 4780 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995160 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995175 4780 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995185 4780 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995196 4780 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91a8fa86-9475-490a-9c9f-09233413eab5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995607 4780 generic.go:334] "Generic (PLEG): container finished" podID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerID="a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a" exitCode=143 Sep 29 19:05:13 crc 
Sep 29 19:05:13 crc kubenswrapper[4780]: I0929 19:05:13.995711 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerDied","Data":"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.000963 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config" (OuterVolumeSpecName: "config") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.015410 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican32a4-account-delete-xckl5" event={"ID":"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4","Type":"ContainerStarted","Data":"c0623af1e08421a5875b41b1bfe12d73958a41aa399f6f1be9762530d51b1405"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.020832 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts" (OuterVolumeSpecName: "scripts") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.025782 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf" event={"ID":"7373591d-cf39-4674-8b37-449096f6a3b6","Type":"ContainerDied","Data":"a83ca5c714827d12697ce7c6aaefa49e88866902ac8d9a6be22ad403eab3e532"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.025862 4780 scope.go:117] "RemoveContainer" containerID="285ff896c42d36d4f725e2aad71a176bed9a283b3fe4f54c742f19da6fd34e81"
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.026133 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cc449b9dc-jhslf"
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.033598 4780 generic.go:334] "Generic (PLEG): container finished" podID="8e1d2b75-0893-468d-8365-f08fa8875575" containerID="4cac85a12f0ad40a5dc9410707339b2f4a75fbc7d7e9f99310b24f564e2e4f03" exitCode=143
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.034086 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerDied","Data":"4cac85a12f0ad40a5dc9410707339b2f4a75fbc7d7e9f99310b24f564e2e4f03"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.046263 4780 generic.go:334] "Generic (PLEG): container finished" podID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerID="02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150" exitCode=143
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.046530 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerDied","Data":"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.054086 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hzb5x"
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.054085 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hzb5x" event={"ID":"91a8fa86-9475-490a-9c9f-09233413eab5","Type":"ContainerDied","Data":"be23bffc337a1f4b1dc6d57e5dda65612b10165d95d04f10642bbea383becb7f"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.063218 4780 generic.go:334] "Generic (PLEG): container finished" podID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerID="0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1" exitCode=143
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.063309 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerDied","Data":"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.069325 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_62b9c388-0f74-42fc-bf3d-711322b976d8/ovsdbserver-sb/0.log"
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.069426 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"62b9c388-0f74-42fc-bf3d-711322b976d8","Type":"ContainerDied","Data":"849a94a4e4201ed620437ffed48bceb3e1f3ed245c6c02acd8932d69d398ccba"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.069657 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.071837 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq" (OuterVolumeSpecName: "kube-api-access-wdxqq") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "kube-api-access-wdxqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.079508 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh" (OuterVolumeSpecName: "kube-api-access-c55lh") pod "3608c7b9-1f29-491f-9a10-48135b074fa4" (UID: "3608c7b9-1f29-491f-9a10-48135b074fa4"). InnerVolumeSpecName "kube-api-access-c55lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.080417 4780 generic.go:334] "Generic (PLEG): container finished" podID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerID="3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92" exitCode=0
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.080567 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerDied","Data":"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.089787 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts" (OuterVolumeSpecName: "scripts") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.093913 4780 generic.go:334] "Generic (PLEG): container finished" podID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerID="22c83df1dfa900462fb0bdf93010df94b1f1fbd660599f3ce6a52119f57afbe9" exitCode=143
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.093996 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerDied","Data":"22c83df1dfa900462fb0bdf93010df94b1f1fbd660599f3ce6a52119f57afbe9"}
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098428 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098495 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098573 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098626 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt6kq\" (UniqueName: \"kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098652 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76bh8\" (UniqueName: \"kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098690 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098795 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098828 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") "
Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.098846 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") "
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.099700 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.099808 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.099837 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.099864 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs\") pod \"8611dff0-9ad1-4bba-b687-958d7e887859\" (UID: \"8611dff0-9ad1-4bba-b687-958d7e887859\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.099887 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc\") pod \"7373591d-cf39-4674-8b37-449096f6a3b6\" (UID: \"7373591d-cf39-4674-8b37-449096f6a3b6\") " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.100529 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c55lh\" (UniqueName: \"kubernetes.io/projected/3608c7b9-1f29-491f-9a10-48135b074fa4-kube-api-access-c55lh\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.100547 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdxqq\" (UniqueName: \"kubernetes.io/projected/62b9c388-0f74-42fc-bf3d-711322b976d8-kube-api-access-wdxqq\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.100556 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.100569 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.100578 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62b9c388-0f74-42fc-bf3d-711322b976d8-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.104970 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config" 
(OuterVolumeSpecName: "config") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.105295 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.106004 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts" (OuterVolumeSpecName: "scripts") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.119259 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_8611dff0-9ad1-4bba-b687-958d7e887859/ovsdbserver-nb/0.log" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.120619 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"8611dff0-9ad1-4bba-b687-958d7e887859","Type":"ContainerDied","Data":"9fff2f32a4c1ab9ea09b9e77c1a8faaab8450d76986c65c67904cfe2197ba50e"} Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.121129 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 19:05:14 crc kubenswrapper[4780]: W0929 19:05:14.137415 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed2917c_127a_4dbd_b951_6b141853e47c.slice/crio-03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094 WatchSource:0}: Error finding container 03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094: Status 404 returned error can't find the container with id 03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.139753 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 29 19:05:14 crc kubenswrapper[4780]: W0929 19:05:14.146172 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod622d766f_f43c_434c_9353_2315a6c82ae6.slice/crio-cacf127956e62910a5b82c40d040fb3d978a2f3b2ee90cad223789607a1d6831 WatchSource:0}: Error finding container cacf127956e62910a5b82c40d040fb3d978a2f3b2ee90cad223789607a1d6831: Status 404 returned error can't find the container with id cacf127956e62910a5b82c40d040fb3d978a2f3b2ee90cad223789607a1d6831 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.153864 4780 generic.go:334] "Generic (PLEG): container finished" podID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerID="aae9731ab2bae2a8e2eb268dc27032c196b1db8b2299a7d349eab203f6ba9217" exitCode=143 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.154176 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerDied","Data":"aae9731ab2bae2a8e2eb268dc27032c196b1db8b2299a7d349eab203f6ba9217"} Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.168316 4780 scope.go:117] "RemoveContainer" containerID="190508865a59a0e7a42d9038f11a9b7f87924fddbc543951d72a888ccb98fb52" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.169991 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.170451 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.170911 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp" (OuterVolumeSpecName: "kube-api-access-wqknp") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "kube-api-access-wqknp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.171780 4780 generic.go:334] "Generic (PLEG): container finished" podID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerID="4b807f34a3c65b6d836e3bd255f8320430de3cf2180ee8e33b572ba6e6717b3b" exitCode=143 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.171828 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerDied","Data":"4b807f34a3c65b6d836e3bd255f8320430de3cf2180ee8e33b572ba6e6717b3b"} Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.174265 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq" (OuterVolumeSpecName: "kube-api-access-tt6kq") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "kube-api-access-tt6kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.185394 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.185644 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8" (OuterVolumeSpecName: "kube-api-access-76bh8") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "kube-api-access-76bh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203194 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203227 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203243 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203252 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203263 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt6kq\" (UniqueName: \"kubernetes.io/projected/7373591d-cf39-4674-8b37-449096f6a3b6-kube-api-access-tt6kq\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203272 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76bh8\" (UniqueName: \"kubernetes.io/projected/8611dff0-9ad1-4bba-b687-958d7e887859-kube-api-access-76bh8\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203283 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203292 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqknp\" (UniqueName: \"kubernetes.io/projected/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-kube-api-access-wqknp\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.203325 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8611dff0-9ad1-4bba-b687-958d7e887859-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.302502 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.321787 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.321821 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.398805 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.410426 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.410921 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-58b5d8cc69-dbww7" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-httpd" containerID="cri-o://8cdf366e564c41077cac425fb60d05141765f8d392ad2a68245952b02c84e442" gracePeriod=30 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.411428 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-58b5d8cc69-dbww7" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-server" containerID="cri-o://2f710c5ce82ca39295b8a385c093185e1904e19720dae5eaeaae61d9187c8809" gracePeriod=30 Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.422860 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.448940 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.448966 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.448984 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.539261 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": read tcp 10.217.0.2:54442->10.217.0.167:8776: read: connection reset by peer" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.544439 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3608c7b9-1f29-491f-9a10-48135b074fa4" (UID: "3608c7b9-1f29-491f-9a10-48135b074fa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.542206 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.550913 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.550941 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.677642 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.681287 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3608c7b9-1f29-491f-9a10-48135b074fa4" (UID: "3608c7b9-1f29-491f-9a10-48135b074fa4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.705256 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config" (OuterVolumeSpecName: "config") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.739179 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.749117 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.761405 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.761447 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.761467 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.761481 4780 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.761493 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.784641 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d5b4b9-e63b-464f-8d39-1fea44ce658c" path="/var/lib/kubelet/pods/09d5b4b9-e63b-464f-8d39-1fea44ce658c/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.787421 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa811a8-3957-47f8-a24c-e307ece95cf2" path="/var/lib/kubelet/pods/2aa811a8-3957-47f8-a24c-e307ece95cf2/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.787968 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ead435a-236c-441f-bb69-6a3f2d5c88e3" path="/var/lib/kubelet/pods/2ead435a-236c-441f-bb69-6a3f2d5c88e3/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.788719 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1ed602-275e-4595-a3fe-171555e9b681" path="/var/lib/kubelet/pods/8c1ed602-275e-4595-a3fe-171555e9b681/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.789927 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4b7df7-18ee-4c71-b7a3-56d799f45bf9" path="/var/lib/kubelet/pods/bb4b7df7-18ee-4c71-b7a3-56d799f45bf9/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.790634 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c" path="/var/lib/kubelet/pods/d0aaf6e3-0d01-4c83-85d9-e396e43a4b6c/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.792317 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6407027-1b5b-454a-83b1-d08d03e5af9c" path="/var/lib/kubelet/pods/d6407027-1b5b-454a-83b1-d08d03e5af9c/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.803424 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def78cb3-faae-4256-9473-926ce387ca60" path="/var/lib/kubelet/pods/def78cb3-faae-4256-9473-926ce387ca60/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.808146 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f0c53ce3-391e-424f-ac39-6c7b49502aa2" path="/var/lib/kubelet/pods/f0c53ce3-391e-424f-ac39-6c7b49502aa2/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.811600 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2087ead-f015-40af-b172-6cf166a01cf6" path="/var/lib/kubelet/pods/f2087ead-f015-40af-b172-6cf166a01cf6/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.812575 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc259cf2-313d-4bbf-add3-3df206332827" path="/var/lib/kubelet/pods/fc259cf2-313d-4bbf-add3-3df206332827/volumes" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.847505 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3608c7b9-1f29-491f-9a10-48135b074fa4" (UID: "3608c7b9-1f29-491f-9a10-48135b074fa4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.848601 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.855470 4780 scope.go:117] "RemoveContainer" containerID="5fb6698dd22bbe0d2a1c5ca4ebf010370e1f1c47fe8aceb15033c39385f078fd" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.877562 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.877591 4780 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3608c7b9-1f29-491f-9a10-48135b074fa4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.882464 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.899370 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "91a8fa86-9475-490a-9c9f-09233413eab5" (UID: "91a8fa86-9475-490a-9c9f-09233413eab5"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.930349 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). 
InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.949257 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.954923 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7373591d-cf39-4674-8b37-449096f6a3b6" (UID: "7373591d-cf39-4674-8b37-449096f6a3b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.962676 4780 scope.go:117] "RemoveContainer" containerID="1a72c9638b5649fd8982600fc6af41f0dfb4434ab14f6a9fa20981be04918d1c" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.969887 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8611dff0-9ad1-4bba-b687-958d7e887859" (UID: "8611dff0-9ad1-4bba-b687-958d7e887859"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.984002 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.984027 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.984037 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8611dff0-9ad1-4bba-b687-958d7e887859-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.984078 4780 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7373591d-cf39-4674-8b37-449096f6a3b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:14 crc kubenswrapper[4780]: I0929 19:05:14.984087 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a8fa86-9475-490a-9c9f-09233413eab5-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.041310 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.052287 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hzb5x"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.081906 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.098760 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.101316 4780 scope.go:117] "RemoveContainer" containerID="554b180a14e22d442412d4d2d0076906c23226808aa92f7beead4a20e385e797" Sep 29 
19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.125946 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data" (OuterVolumeSpecName: "config-data") pod "2903cdd8-3ab5-4c85-892c-2139eb0bde7c" (UID: "2903cdd8-3ab5-4c85-892c-2139eb0bde7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.145581 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191175 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs\") pod \"a1f2aaf8-27dc-428c-a387-d63424889230\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191221 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle\") pod \"a1f2aaf8-27dc-428c-a387-d63424889230\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191253 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data\") pod \"a1f2aaf8-27dc-428c-a387-d63424889230\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191338 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxnn2\" (UniqueName: \"kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2\") pod \"a1f2aaf8-27dc-428c-a387-d63424889230\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191363 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs\") pod \"a1f2aaf8-27dc-428c-a387-d63424889230\" (UID: \"a1f2aaf8-27dc-428c-a387-d63424889230\") " Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.191638 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2903cdd8-3ab5-4c85-892c-2139eb0bde7c-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.210387 4780 scope.go:117] "RemoveContainer" containerID="ca3350f3db78178478e71ece3c7e24a200961466f9224460cb27c414e7b48f42" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.302551 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2bngj"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.315750 4780 generic.go:334] "Generic (PLEG): container finished" podID="a1f2aaf8-27dc-428c-a387-d63424889230" containerID="4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383" exitCode=0 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.315840 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"a1f2aaf8-27dc-428c-a387-d63424889230","Type":"ContainerDied","Data":"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.315869 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a1f2aaf8-27dc-428c-a387-d63424889230","Type":"ContainerDied","Data":"d5fb7650f7269fd4333da7d14849b4fa6a9bb6ef66af8f767aca3727b13d9a68"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.315948 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.325161 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2bngj"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.337424 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerID="391cc111e8cd575fa81674aac39e64f1e0c3b2f3fc46853f4758411b706b35aa" exitCode=0 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.337548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerDied","Data":"391cc111e8cd575fa81674aac39e64f1e0c3b2f3fc46853f4758411b706b35aa"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.344508 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.348372 4780 generic.go:334] "Generic (PLEG): container finished" podID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerID="ff8a529133b59522aa5a47a19801e5fe0c76dbf90cf9186ffe730d3e74db9aba" exitCode=0 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.348459 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerDied","Data":"ff8a529133b59522aa5a47a19801e5fe0c76dbf90cf9186ffe730d3e74db9aba"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.352243 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0c9e-account-create-z6hgb"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.354324 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron33e2-account-delete-nr86j" event={"ID":"83f061df-a5ff-4db1-b87f-4106a5e56b55","Type":"ContainerStarted","Data":"9276d955249f1f3bcb322a56e077252cb95db4ceccbb258e14cb001bb0637021"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.355400 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancefda4-account-delete-4ndzp" event={"ID":"f271f9ca-bced-4144-b779-06e7422d9a63","Type":"ContainerStarted","Data":"7da71a54065fafc75a333926d18e4c6e000967115b3c89d4cc36c0a56cbc1e01"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.357088 4780 generic.go:334] "Generic (PLEG): container finished" podID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerID="8cdf366e564c41077cac425fb60d05141765f8d392ad2a68245952b02c84e442" exitCode=0 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.357129 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerDied","Data":"8cdf366e564c41077cac425fb60d05141765f8d392ad2a68245952b02c84e442"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.359226 4780 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placementb644-account-delete-t2p8t" event={"ID":"503714fd-6dcf-4b1d-8806-dd78a3e85b7f","Type":"ContainerStarted","Data":"4895d72a5fe414b04d0bbbd73c376248068817a47d2969abfb8fe95e9c93bd75"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.361587 4780 generic.go:334] "Generic (PLEG): container finished" podID="5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" containerID="5819364439d5cb95e55e3ab9534211d4daffea11f49034a2d033c40c6ec821a9" exitCode=0 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.361714 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican32a4-account-delete-xckl5" event={"ID":"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4","Type":"ContainerDied","Data":"5819364439d5cb95e55e3ab9534211d4daffea11f49034a2d033c40c6ec821a9"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.362480 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0c9e-account-create-z6hgb"] Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.408625 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0c9e-account-delete-l4t5r" event={"ID":"622d766f-f43c-434c-9353-2315a6c82ae6","Type":"ContainerStarted","Data":"cacf127956e62910a5b82c40d040fb3d978a2f3b2ee90cad223789607a1d6831"} Sep 29 19:05:15 crc kubenswrapper[4780]: E0929 19:05:15.417358 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:15 crc kubenswrapper[4780]: E0929 19:05:15.430319 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:15 crc kubenswrapper[4780]: E0929 19:05:15.435332 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:15 crc kubenswrapper[4780]: E0929 19:05:15.435390 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="bc401926-3969-448c-9910-22572fecb168" containerName="nova-cell1-conductor-conductor" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.442115 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2" (OuterVolumeSpecName: "kube-api-access-dxnn2") pod "a1f2aaf8-27dc-428c-a387-d63424889230" (UID: "a1f2aaf8-27dc-428c-a387-d63424889230"). InnerVolumeSpecName "kube-api-access-dxnn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.443217 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.443311 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell063ac-account-delete-rvt6m" event={"ID":"eed2917c-127a-4dbd-b951-6b141853e47c","Type":"ContainerStarted","Data":"243f2148e91378364f15ab121c1c59d743e4fb88de99cd62a3d13f1cfef6c462"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.443368 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell063ac-account-delete-rvt6m" event={"ID":"eed2917c-127a-4dbd-b951-6b141853e47c","Type":"ContainerStarted","Data":"03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094"} Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.443980 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell063ac-account-delete-rvt6m" podUID="eed2917c-127a-4dbd-b951-6b141853e47c" containerName="mariadb-account-delete" containerID="cri-o://243f2148e91378364f15ab121c1c59d743e4fb88de99cd62a3d13f1cfef6c462" gracePeriod=30 Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.502819 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxnn2\" (UniqueName: \"kubernetes.io/projected/a1f2aaf8-27dc-428c-a387-d63424889230-kube-api-access-dxnn2\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.619865 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "62b9c388-0f74-42fc-bf3d-711322b976d8" (UID: "62b9c388-0f74-42fc-bf3d-711322b976d8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.680157 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data" (OuterVolumeSpecName: "config-data") pod "a1f2aaf8-27dc-428c-a387-d63424889230" (UID: "a1f2aaf8-27dc-428c-a387-d63424889230"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.701735 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1f2aaf8-27dc-428c-a387-d63424889230" (UID: "a1f2aaf8-27dc-428c-a387-d63424889230"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.702734 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "a1f2aaf8-27dc-428c-a387-d63424889230" (UID: "a1f2aaf8-27dc-428c-a387-d63424889230"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.707488 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.707525 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.707536 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b9c388-0f74-42fc-bf3d-711322b976d8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.707547 4780 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.741176 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "a1f2aaf8-27dc-428c-a387-d63424889230" (UID: "a1f2aaf8-27dc-428c-a387-d63424889230"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:15 crc kubenswrapper[4780]: I0929 19:05:15.809610 4780 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1f2aaf8-27dc-428c-a387-d63424889230-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.363525 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell063ac-account-delete-rvt6m" podStartSLOduration=6.363465686 podStartE2EDuration="6.363465686s" podCreationTimestamp="2025-09-29 19:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:05:15.479098993 +0000 UTC m=+1315.427397037" watchObservedRunningTime="2025-09-29 19:05:16.363465686 +0000 UTC m=+1316.311763730" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.364225 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.364536 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-central-agent" containerID="cri-o://f88db872bb531d67943f47affb487b5a77c5ff64bdf19d2564052e453ae34187" gracePeriod=30 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.365019 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="proxy-httpd" containerID="cri-o://a13ab8e97bfc1c433e41ba1fdbdc614073a33b3747ee1e7b9e9cd3cb214ce595" gracePeriod=30 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.365305 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="sg-core" 
containerID="cri-o://f64e558c1911bea7506b2cdd5c000f9c4c3d8816f4e4b6adc9002538b83090a4" gracePeriod=30 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.365359 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-notification-agent" containerID="cri-o://2206fdfda1b3679c9eaab7892ccf4c32611624a3996175a4dd0502159b261a25" gracePeriod=30 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.398722 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.398988 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" containerName="kube-state-metrics" containerID="cri-o://64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a" gracePeriod=30 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.448701 4780 scope.go:117] "RemoveContainer" containerID="bb569064870cf713912feaf5a437a10267cc1ffdc972eec11af4085d6191acce" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.473062 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.513262 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.515668 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.523558 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.527887 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528036 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528130 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528180 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528315 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 
29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528356 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528443 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cqxn\" (UniqueName: \"kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528484 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.528524 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.532970 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.533599 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.543935 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.546059 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.548442 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn" (OuterVolumeSpecName: "kube-api-access-4cqxn") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "kube-api-access-4cqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.557745 4780 generic.go:334] "Generic (PLEG): container finished" podID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerID="3f91861bf876fab40cbb103b723d6d21a4fb3a2aeadedbbda6de0035a6ee2aa7" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.557814 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerDied","Data":"3f91861bf876fab40cbb103b723d6d21a4fb3a2aeadedbbda6de0035a6ee2aa7"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.593122 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets" (OuterVolumeSpecName: "secrets") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.598808 4780 generic.go:334] "Generic (PLEG): container finished" podID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerID="eeee69b0a809e51c2de8aae84184f344369f4e4f6fab7ebfa4f65f602565ed13" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.601908 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerDied","Data":"eeee69b0a809e51c2de8aae84184f344369f4e4f6fab7ebfa4f65f602565ed13"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.619543 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.626076 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.634552 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.634788 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.634877 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg62t\" (UniqueName: \"kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.634946 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635022 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635191 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635262 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635489 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635521 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635600 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"628b549e-6d99-43d4-94bb-61b457f4c37b\" (UID: \"628b549e-6d99-43d4-94bb-61b457f4c37b\") " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.635682 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs\") pod \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\" (UID: \"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d\") " Sep 29 
19:05:16 crc kubenswrapper[4780]: W0929 19:05:16.637221 4780 mount_helper_common.go:34] Warning: mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/628b549e-6d99-43d4-94bb-61b457f4c37b/volumes/kubernetes.io~local-volume/local-storage11-crc Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.638643 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.637354 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.641603 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs" (OuterVolumeSpecName: "logs") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.651305 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654075 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6764d576f6-q7trv" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:45930->10.217.0.156:9311: read: connection reset by peer" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654196 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6764d576f6-q7trv" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:45914->10.217.0.156:9311: read: connection reset by peer" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654478 4780 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654504 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-default\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654514 4780 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654524 4780 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-secrets\") on node \"crc\" DevicePath \"\"" Sep 
29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654532 4780 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628b549e-6d99-43d4-94bb-61b457f4c37b-operator-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654541 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cqxn\" (UniqueName: \"kubernetes.io/projected/628b549e-6d99-43d4-94bb-61b457f4c37b-kube-api-access-4cqxn\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654552 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/628b549e-6d99-43d4-94bb-61b457f4c37b-config-data-generated\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654577 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.654587 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.665421 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t" (OuterVolumeSpecName: "kube-api-access-bg62t") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "kube-api-access-bg62t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.666416 4780 generic.go:334] "Generic (PLEG): container finished" podID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerID="2f710c5ce82ca39295b8a385c093185e1904e19720dae5eaeaae61d9187c8809" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.666495 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cc449b9dc-jhslf"] Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.666521 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerDied","Data":"2f710c5ce82ca39295b8a385c093185e1904e19720dae5eaeaae61d9187c8809"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.666538 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b5d8cc69-dbww7" event={"ID":"6422eb63-373a-4b79-88b0-ddd623f7bd79","Type":"ContainerDied","Data":"5c83268540159aa38b72bf722a5fe2387b9ac1dcc4d6ec487e0383efdab63ac6"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.666548 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c83268540159aa38b72bf722a5fe2387b9ac1dcc4d6ec487e0383efdab63ac6" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.689798 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.695234 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts" (OuterVolumeSpecName: "scripts") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.709222 4780 generic.go:334] "Generic (PLEG): container finished" podID="e42e5bce-9395-4758-8121-35408b6df2e2" containerID="f64e558c1911bea7506b2cdd5c000f9c4c3d8816f4e4b6adc9002538b83090a4" exitCode=2 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.709308 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerDied","Data":"f64e558c1911bea7506b2cdd5c000f9c4c3d8816f4e4b6adc9002538b83090a4"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.732304 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"628b549e-6d99-43d4-94bb-61b457f4c37b","Type":"ContainerDied","Data":"aa9d73c831e1295dcb9e12df8b4e963b68ccc3a2e1d42a671190107f6e82a599"} Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.732424 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.747862 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.766970 4780 generic.go:334] "Generic (PLEG): container finished" podID="eed2917c-127a-4dbd-b951-6b141853e47c" containerID="243f2148e91378364f15ab121c1c59d743e4fb88de99cd62a3d13f1cfef6c462" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.775904 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.775933 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.775945 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg62t\" (UniqueName: \"kubernetes.io/projected/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-kube-api-access-bg62t\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.775956 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.799058 4780 generic.go:334] "Generic (PLEG): container finished" podID="622d766f-f43c-434c-9353-2315a6c82ae6" containerID="1bf559685fa2ec4cdd7e6a75b9791a1126fabef80566a5ec0ad7a61e63c638ce" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.835973 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b25d81f-c401-483c-b772-d2570b578c8c" path="/var/lib/kubelet/pods/1b25d81f-c401-483c-b772-d2570b578c8c/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.840538 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.866391 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" path="/var/lib/kubelet/pods/2903cdd8-3ab5-4c85-892c-2139eb0bde7c/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.869644 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:38904->10.217.0.202:8775: read: connection reset by peer" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.869894 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:38892->10.217.0.202:8775: read: connection reset by peer" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.874978 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3608c7b9-1f29-491f-9a10-48135b074fa4" path="/var/lib/kubelet/pods/3608c7b9-1f29-491f-9a10-48135b074fa4/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.879497 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.886324 4780 generic.go:334] "Generic (PLEG): container finished" podID="503714fd-6dcf-4b1d-8806-dd78a3e85b7f" containerID="325c5a786e1853685f85b6fc6f7bdd59ae3bf16fce0c288ff4e80cd0ba149002" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.886654 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" path="/var/lib/kubelet/pods/7373591d-cf39-4674-8b37-449096f6a3b6/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.890338 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" path="/var/lib/kubelet/pods/8611dff0-9ad1-4bba-b687-958d7e887859/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.891001 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" path="/var/lib/kubelet/pods/91a8fa86-9475-490a-9c9f-09233413eab5/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.894373 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.896828 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" path="/var/lib/kubelet/pods/a1f2aaf8-27dc-428c-a387-d63424889230/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.898410 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda6e087-f771-4d4c-870a-e6a1c9d1c98c" path="/var/lib/kubelet/pods/fda6e087-f771-4d4c-870a-e6a1c9d1c98c/volumes" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.937251 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "628b549e-6d99-43d4-94bb-61b457f4c37b" (UID: "628b549e-6d99-43d4-94bb-61b457f4c37b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.950252 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data" (OuterVolumeSpecName: "config-data") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.952967 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.962789 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.969162 4780 generic.go:334] "Generic (PLEG): container finished" podID="83f061df-a5ff-4db1-b87f-4106a5e56b55" containerID="3383dec5afb930a94bac2f13b88f7cee0613113811cd7f239db85c1098ebc8d7" exitCode=0 Sep 29 19:05:16 crc kubenswrapper[4780]: I0929 19:05:16.991384 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" (UID: "f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:16.996652 4780 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/628b549e-6d99-43d4-94bb-61b457f4c37b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:16.996692 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:16.996705 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:16.996718 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:16.996731 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015584 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell063ac-account-delete-rvt6m" event={"ID":"eed2917c-127a-4dbd-b951-6b141853e47c","Type":"ContainerDied","Data":"243f2148e91378364f15ab121c1c59d743e4fb88de99cd62a3d13f1cfef6c462"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015618 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell063ac-account-delete-rvt6m" event={"ID":"eed2917c-127a-4dbd-b951-6b141853e47c","Type":"ContainerDied","Data":"03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015629 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03eceae30dd4dd56e80e68d7697ea7d37393ed358997d93d4fbba343abce5094" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015637 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0c9e-account-delete-l4t5r" event={"ID":"622d766f-f43c-434c-9353-2315a6c82ae6","Type":"ContainerDied","Data":"1bf559685fa2ec4cdd7e6a75b9791a1126fabef80566a5ec0ad7a61e63c638ce"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015649 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican32a4-account-delete-xckl5" 
event={"ID":"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4","Type":"ContainerDied","Data":"c0623af1e08421a5875b41b1bfe12d73958a41aa399f6f1be9762530d51b1405"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015662 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0623af1e08421a5875b41b1bfe12d73958a41aa399f6f1be9762530d51b1405" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015670 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementb644-account-delete-t2p8t" event={"ID":"503714fd-6dcf-4b1d-8806-dd78a3e85b7f","Type":"ContainerDied","Data":"325c5a786e1853685f85b6fc6f7bdd59ae3bf16fce0c288ff4e80cd0ba149002"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015682 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d","Type":"ContainerDied","Data":"5b0cd11d894df6419f2647d47c9d408f4434a3a224381a40853e23c6715badbf"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015704 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015717 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron33e2-account-delete-nr86j" event={"ID":"83f061df-a5ff-4db1-b87f-4106a5e56b55","Type":"ContainerDied","Data":"3383dec5afb930a94bac2f13b88f7cee0613113811cd7f239db85c1098ebc8d7"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015731 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015745 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015759 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-czsds"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015769 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-czsds"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015778 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v9wvf"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015787 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v9wvf"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015799 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015810 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015820 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xfmwd"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015829 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xfmwd"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015839 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-928f-account-create-wfndn"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.015848 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-928f-account-create-wfndn"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.016230 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" 
podUID="58ef0b7e-a06d-49a2-824e-9f088c267a97" containerName="memcached" containerID="cri-o://fa44c2b6e56600dfb6c99d6fb0e419237762ff70fabe663a6e3f18eded510c50" gracePeriod=30 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.016359 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-65cff5765c-kflf7" podUID="ef4fe84d-ff10-4ed2-938a-669c30748336" containerName="keystone-api" containerID="cri-o://c0d63b73993fb464d59534865d2e91fd588cd9d7e1421e00fa83f404e4d2d957" gracePeriod=30 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.019706 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.063848 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.090172 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.096520 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.096550 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.097538 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czx87\" (UniqueName: \"kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87\") pod \"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4\" (UID: \"5d5ccc95-6c2c-4f3c-884b-456cf28d6db4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.099100 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.099166 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerName="nova-cell0-conductor-conductor" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.103665 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.109812 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.116091 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.126672 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87" (OuterVolumeSpecName: "kube-api-access-czx87") pod "5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" (UID: "5d5ccc95-6c2c-4f3c-884b-456cf28d6db4"). InnerVolumeSpecName "kube-api-access-czx87". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.161128 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.184183 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.199269 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.199354 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerName="nova-scheduler-scheduler" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199378 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmtqv\" (UniqueName: \"kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv\") pod \"eed2917c-127a-4dbd-b951-6b141853e47c\" (UID: \"eed2917c-127a-4dbd-b951-6b141853e47c\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199457 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199502 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199585 4780 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199631 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199674 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxxr\" (UniqueName: \"kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199704 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199739 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199809 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199859 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199891 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199927 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.199957 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm2hj\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.200010 
4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift\") pod \"6422eb63-373a-4b79-88b0-ddd623f7bd79\" (UID: \"6422eb63-373a-4b79-88b0-ddd623f7bd79\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.200030 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs\") pod \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\" (UID: \"02521078-2e58-4ce2-bc12-0b6c3b2ed878\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.201130 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czx87\" (UniqueName: \"kubernetes.io/projected/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4-kube-api-access-czx87\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.204079 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.204139 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.204557 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs" (OuterVolumeSpecName: "logs") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.251906 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv" (OuterVolumeSpecName: "kube-api-access-xmtqv") pod "eed2917c-127a-4dbd-b951-6b141853e47c" (UID: "eed2917c-127a-4dbd-b951-6b141853e47c"). InnerVolumeSpecName "kube-api-access-xmtqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.255028 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.292968 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr" (OuterVolumeSpecName: "kube-api-access-kmxxr") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "kube-api-access-kmxxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.293644 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj" (OuterVolumeSpecName: "kube-api-access-jm2hj") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "kube-api-access-jm2hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303238 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm2hj\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-kube-api-access-jm2hj\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303269 4780 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6422eb63-373a-4b79-88b0-ddd623f7bd79-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303279 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02521078-2e58-4ce2-bc12-0b6c3b2ed878-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303288 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmtqv\" (UniqueName: \"kubernetes.io/projected/eed2917c-127a-4dbd-b951-6b141853e47c-kube-api-access-xmtqv\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303297 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303306 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxxr\" (UniqueName: \"kubernetes.io/projected/02521078-2e58-4ce2-bc12-0b6c3b2ed878-kube-api-access-kmxxr\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.303317 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6422eb63-373a-4b79-88b0-ddd623f7bd79-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.308826 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.313269 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data" (OuterVolumeSpecName: "config-data") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.327118 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="galera" containerID="cri-o://f87b8bafb323301052d22ea81d2721d5221500537424fea022247a8e792a03e3" gracePeriod=30 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.335316 4780 scope.go:117] "RemoveContainer" containerID="f9639917e8369f971bbda3bb865e49a2021d379743b2c56bab33019529c9a847" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.345539 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.354243 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.354484 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.383300 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data" (OuterVolumeSpecName: "config-data") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.400793 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6422eb63-373a-4b79-88b0-ddd623f7bd79" (UID: "6422eb63-373a-4b79-88b0-ddd623f7bd79"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405870 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405902 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405912 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405922 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405930 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.405940 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.406001 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.406122 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data podName:d2ee2741-9417-4698-b550-7c596d00d271 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:25.40610307 +0000 UTC m=+1325.354401114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data") pod "rabbitmq-server-0" (UID: "d2ee2741-9417-4698-b550-7c596d00d271") : configmap "rabbitmq-config-data" not found Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.406164 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6422eb63-373a-4b79-88b0-ddd623f7bd79-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.409573 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02521078-2e58-4ce2-bc12-0b6c3b2ed878" (UID: "02521078-2e58-4ce2-bc12-0b6c3b2ed878"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.508301 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02521078-2e58-4ce2-bc12-0b6c3b2ed878-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.512968 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.523107 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.540532 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.541544 4780 scope.go:117] "RemoveContainer" containerID="4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.559330 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.602310 4780 scope.go:117] "RemoveContainer" containerID="4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.606818 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383\": container with ID starting with 4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383 not found: ID does not exist" containerID="4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.606873 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383"} err="failed to get container status \"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383\": rpc error: code = NotFound desc = could not find container \"4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383\": container with ID starting with 4e52ea5f9226b6cc6c59248e7c469572d23ab03793c0f0e7ff375f4506465383 not found: ID does not exist" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.606907 4780 scope.go:117] "RemoveContainer" containerID="ff8a529133b59522aa5a47a19801e5fe0c76dbf90cf9186ffe730d3e74db9aba" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.609758 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.609842 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xts\" (UniqueName: \"kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610023 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610090 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610112 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs\") pod \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610168 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610231 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610287 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle\") pod \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610317 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config\") pod \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610340 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610381 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run\") pod \"e14f7a20-d45e-4662-b0db-4af394c7daed\" (UID: \"e14f7a20-d45e-4662-b0db-4af394c7daed\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.610417 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d524t\" (UniqueName: \"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t\") pod \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\" (UID: \"ed88e38f-cb35-4072-8f9f-1c6ab980ec03\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.618626 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t" (OuterVolumeSpecName: "kube-api-access-d524t") pod "ed88e38f-cb35-4072-8f9f-1c6ab980ec03" (UID: "ed88e38f-cb35-4072-8f9f-1c6ab980ec03"). InnerVolumeSpecName "kube-api-access-d524t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.619464 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs" (OuterVolumeSpecName: "logs") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.619827 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.626141 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts" (OuterVolumeSpecName: "scripts") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.626872 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.654140 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.655189 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts" (OuterVolumeSpecName: "kube-api-access-l6xts") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "kube-api-access-l6xts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.655984 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.659630 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ed88e38f-cb35-4072-8f9f-1c6ab980ec03" (UID: "ed88e38f-cb35-4072-8f9f-1c6ab980ec03"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.674304 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.680669 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.682359 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.687629 4780 scope.go:117] "RemoveContainer" containerID="8af73da64b8697605018768f5efc2298ee5aa5426fed89f61dfd4c0b10c58708" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712004 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlt7x\" (UniqueName: \"kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x\") pod \"503714fd-6dcf-4b1d-8806-dd78a3e85b7f\" (UID: \"503714fd-6dcf-4b1d-8806-dd78a3e85b7f\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712311 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfh44\" (UniqueName: \"kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44\") pod \"622d766f-f43c-434c-9353-2315a6c82ae6\" (UID: \"622d766f-f43c-434c-9353-2315a6c82ae6\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712722 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712753 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712764 4780 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712775 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712783 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e14f7a20-d45e-4662-b0db-4af394c7daed-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712792 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d524t\" (UniqueName: \"kubernetes.io/projected/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-api-access-d524t\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712801 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.712811 4780 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xts\" (UniqueName: \"kubernetes.io/projected/e14f7a20-d45e-4662-b0db-4af394c7daed-kube-api-access-l6xts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.717335 4780 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:17 crc kubenswrapper[4780]: E0929 19:05:17.717434 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data podName:b90472c3-a09d-433c-922b-d164a11636e6 nodeName:}" failed. No retries permitted until 2025-09-29 19:05:25.717413409 +0000 UTC m=+1325.665711453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b90472c3-a09d-433c-922b-d164a11636e6") : configmap "rabbitmq-cell1-config-data" not found Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.747426 4780 scope.go:117] "RemoveContainer" containerID="391cc111e8cd575fa81674aac39e64f1e0c3b2f3fc46853f4758411b706b35aa" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.755582 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x" (OuterVolumeSpecName: "kube-api-access-vlt7x") pod "503714fd-6dcf-4b1d-8806-dd78a3e85b7f" (UID: "503714fd-6dcf-4b1d-8806-dd78a3e85b7f"). InnerVolumeSpecName "kube-api-access-vlt7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.755692 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44" (OuterVolumeSpecName: "kube-api-access-kfh44") pod "622d766f-f43c-434c-9353-2315a6c82ae6" (UID: "622d766f-f43c-434c-9353-2315a6c82ae6"). InnerVolumeSpecName "kube-api-access-kfh44". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.760601 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ed88e38f-cb35-4072-8f9f-1c6ab980ec03" (UID: "ed88e38f-cb35-4072-8f9f-1c6ab980ec03"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.763619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed88e38f-cb35-4072-8f9f-1c6ab980ec03" (UID: "ed88e38f-cb35-4072-8f9f-1c6ab980ec03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.787974 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.815174 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816276 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816355 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816408 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816483 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816524 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816562 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816606 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816650 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbtvd\" (UniqueName: \"kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816734 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grl7p\" (UniqueName: \"kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " 
Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816739 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816768 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816809 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816881 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data\") pod \"b7f300da-65dd-4c6e-ae4a-63b797768651\" (UID: \"b7f300da-65dd-4c6e-ae4a-63b797768651\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.816931 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817074 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom\") pod \"6c538b0f-23b3-440d-9775-5f33f7badfd4\" (UID: \"6c538b0f-23b3-440d-9775-5f33f7badfd4\") " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817904 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817930 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817946 4780 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817961 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfh44\" (UniqueName: \"kubernetes.io/projected/622d766f-f43c-434c-9353-2315a6c82ae6-kube-api-access-kfh44\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817974 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed88e38f-cb35-4072-8f9f-1c6ab980ec03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 
19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.817986 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlt7x\" (UniqueName: \"kubernetes.io/projected/503714fd-6dcf-4b1d-8806-dd78a3e85b7f-kube-api-access-vlt7x\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.822758 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts" (OuterVolumeSpecName: "scripts") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.823287 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs" (OuterVolumeSpecName: "logs") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.828531 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs" (OuterVolumeSpecName: "logs") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.829786 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd" (OuterVolumeSpecName: "kube-api-access-kbtvd") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "kube-api-access-kbtvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.833158 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.851935 4780 scope.go:117] "RemoveContainer" containerID="aae1a7e720cb23ff6cab4d895d4d7d7fe47acc5b243d3c4f6eaa4b6fe46a9e00" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.852105 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.852239 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p" (OuterVolumeSpecName: "kube-api-access-grl7p") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "kube-api-access-grl7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.853537 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.867986 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.870062 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data" (OuterVolumeSpecName: "config-data") pod "e14f7a20-d45e-4662-b0db-4af394c7daed" (UID: "e14f7a20-d45e-4662-b0db-4af394c7daed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.881138 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.905170 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919699 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919726 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919736 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919744 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c538b0f-23b3-440d-9775-5f33f7badfd4-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919754 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919780 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919789 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919797 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbtvd\" (UniqueName: \"kubernetes.io/projected/b7f300da-65dd-4c6e-ae4a-63b797768651-kube-api-access-kbtvd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919807 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14f7a20-d45e-4662-b0db-4af394c7daed-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919816 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grl7p\" (UniqueName: \"kubernetes.io/projected/6c538b0f-23b3-440d-9775-5f33f7badfd4-kube-api-access-grl7p\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.919824 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7f300da-65dd-4c6e-ae4a-63b797768651-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.934788 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.939816 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.977207 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.990899 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.993879 4780 generic.go:334] "Generic (PLEG): container finished" podID="e42e5bce-9395-4758-8121-35408b6df2e2" containerID="a13ab8e97bfc1c433e41ba1fdbdc614073a33b3747ee1e7b9e9cd3cb214ce595" exitCode=0 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.993911 4780 generic.go:334] "Generic (PLEG): container finished" podID="e42e5bce-9395-4758-8121-35408b6df2e2" containerID="f88db872bb531d67943f47affb487b5a77c5ff64bdf19d2564052e453ae34187" exitCode=0 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.993954 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerDied","Data":"a13ab8e97bfc1c433e41ba1fdbdc614073a33b3747ee1e7b9e9cd3cb214ce595"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.994068 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerDied","Data":"f88db872bb531d67943f47affb487b5a77c5ff64bdf19d2564052e453ae34187"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.995932 4780 generic.go:334] "Generic (PLEG): container finished" podID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerID="724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c" exitCode=0 Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.995980 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerDied","Data":"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.995998 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7f300da-65dd-4c6e-ae4a-63b797768651","Type":"ContainerDied","Data":"b393fe7fa56031a5a162265613befde6b421273597cc8db97ff1d05b2ec32411"} Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.996014 4780 scope.go:117] "RemoveContainer" containerID="724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c" Sep 29 19:05:17 crc kubenswrapper[4780]: I0929 19:05:17.996155 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.006145 4780 generic.go:334] "Generic (PLEG): container finished" podID="6105150b-678d-4925-a981-9a0d75377f32" containerID="0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec" exitCode=0 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.006209 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerDied","Data":"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.006236 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f568c9c76-zb5pj" event={"ID":"6105150b-678d-4925-a981-9a0d75377f32","Type":"ContainerDied","Data":"1a3ce292de08a638eb8004e833c82b814d95c0296dd191f4aaf9b9bcfadee3a7"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.006304 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f568c9c76-zb5pj" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.008642 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancefda4-account-delete-4ndzp" event={"ID":"f271f9ca-bced-4144-b779-06e7422d9a63","Type":"ContainerStarted","Data":"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.008749 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glancefda4-account-delete-4ndzp" podUID="f271f9ca-bced-4144-b779-06e7422d9a63" containerName="mariadb-account-delete" containerID="cri-o://ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c" gracePeriod=30 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.011020 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e14f7a20-d45e-4662-b0db-4af394c7daed","Type":"ContainerDied","Data":"dfe968157267fe3b8b2e7d077102925efda37cf4ef744e99b8e1bdfe9da91a6e"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.011154 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.017114 4780 generic.go:334] "Generic (PLEG): container finished" podID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerID="7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1" exitCode=0 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.017214 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerDied","Data":"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.017262 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248","Type":"ContainerDied","Data":"18d14c63a50f5d576243af21f472ea4e7691a4771c90ea3fd9fda8316001056f"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.017342 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.020845 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.020893 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.020946 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.020972 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.020990 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle\") pod \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021012 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021062 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021114 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021131 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021195 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qjvh\" (UniqueName: \"kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh\") pod 
\"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021224 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phdxh\" (UniqueName: \"kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh\") pod \"83f061df-a5ff-4db1-b87f-4106a5e56b55\" (UID: \"83f061df-a5ff-4db1-b87f-4106a5e56b55\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021266 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021345 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mkcg\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021375 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b90472c3-a09d-433c-922b-d164a11636e6\" (UID: \"b90472c3-a09d-433c-922b-d164a11636e6\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021426 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data\") pod \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021441 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs\") pod \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.021462 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs\") pod \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\" (UID: \"792eb9b5-5b6a-4c61-bc3f-8ab53d64a248\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.023729 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs" (OuterVolumeSpecName: "logs") pod "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" (UID: "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.024213 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.030751 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.031572 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.041548 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg" (OuterVolumeSpecName: "kube-api-access-9mkcg") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "kube-api-access-9mkcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.041552 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh" (OuterVolumeSpecName: "kube-api-access-phdxh") pod "83f061df-a5ff-4db1-b87f-4106a5e56b55" (UID: "83f061df-a5ff-4db1-b87f-4106a5e56b55"). InnerVolumeSpecName "kube-api-access-phdxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.042637 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh" (OuterVolumeSpecName: "kube-api-access-9qjvh") pod "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" (UID: "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248"). InnerVolumeSpecName "kube-api-access-9qjvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.042735 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.046672 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.050547 4780 generic.go:334] "Generic (PLEG): container finished" podID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerID="968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c" exitCode=0 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.050639 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerDied","Data":"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.050665 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6764d576f6-q7trv" event={"ID":"6c538b0f-23b3-440d-9775-5f33f7badfd4","Type":"ContainerDied","Data":"4f7a00a71bb47ebc0f497edccce28d5b462dfb6f910a3b73f73c4fdced01a9ce"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.050730 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6764d576f6-q7trv" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.051278 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data" (OuterVolumeSpecName: "config-data") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.065244 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.065334 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.067766 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c538b0f-23b3-440d-9775-5f33f7badfd4" (UID: "6c538b0f-23b3-440d-9775-5f33f7badfd4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.068125 4780 scope.go:117] "RemoveContainer" containerID="02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.068155 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info" (OuterVolumeSpecName: "pod-info") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.073191 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.092425 4780 generic.go:334] "Generic (PLEG): container finished" podID="d2ee2741-9417-4698-b550-7c596d00d271" containerID="a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac" exitCode=0 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.092527 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerDied","Data":"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.092554 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d2ee2741-9417-4698-b550-7c596d00d271","Type":"ContainerDied","Data":"8c1f691cc3507568b334ffc02e35c0ed5e3ed1451e483fdb1eb306977a343ce1"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.092608 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.108409 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data" (OuterVolumeSpecName: "config-data") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.112899 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data" (OuterVolumeSpecName: "config-data") pod "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" (UID: "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.126765 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131263 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131370 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131443 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131482 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131500 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131537 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131576 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131610 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxl4c\" (UniqueName: \"kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.131631 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: 
\"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132069 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132096 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132118 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle\") pod \"6105150b-678d-4925-a981-9a0d75377f32\" (UID: \"6105150b-678d-4925-a981-9a0d75377f32\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132137 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrc8\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132159 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132186 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132204 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.132224 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret\") pod \"d2ee2741-9417-4698-b550-7c596d00d271\" (UID: \"d2ee2741-9417-4698-b550-7c596d00d271\") " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.134920 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.137442 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"02521078-2e58-4ce2-bc12-0b6c3b2ed878","Type":"ContainerDied","Data":"226241802adc1b66e53d945d547c153f613a5ab0f03dba72a8f016df1608204b"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.137559 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138514 4780 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138544 4780 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b90472c3-a09d-433c-922b-d164a11636e6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138557 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138568 4780 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b90472c3-a09d-433c-922b-d164a11636e6-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138578 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138589 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qjvh\" (UniqueName: \"kubernetes.io/projected/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-kube-api-access-9qjvh\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138599 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phdxh\" (UniqueName: \"kubernetes.io/projected/83f061df-a5ff-4db1-b87f-4106a5e56b55-kube-api-access-phdxh\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138609 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138617 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138627 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138637 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mkcg\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-kube-api-access-9mkcg\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138663 
4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138673 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138682 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138691 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c538b0f-23b3-440d-9775-5f33f7badfd4-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138699 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138708 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.138718 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.142916 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.143080 4780 scope.go:117] "RemoveContainer" containerID="724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c" Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.149943 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c\": container with ID starting with 724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c not found: ID does not exist" containerID="724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.149986 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c"} err="failed to get container status \"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c\": rpc error: code = NotFound desc = could not find container \"724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c\": container with ID starting with 724a2b1cd5960a5e1f086c2b9e475ff945f6861e8245bcee56ab187bcc2f427c not found: ID does not exist" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.150038 4780 scope.go:117] "RemoveContainer" containerID="02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.151192 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.151624 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs" (OuterVolumeSpecName: "logs") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.154321 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glancefda4-account-delete-4ndzp" podStartSLOduration=8.15429387 podStartE2EDuration="8.15429387s" podCreationTimestamp="2025-09-29 19:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 19:05:18.125272879 +0000 UTC m=+1318.073570943" watchObservedRunningTime="2025-09-29 19:05:18.15429387 +0000 UTC m=+1318.102591914" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.158468 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data" (OuterVolumeSpecName: "config-data") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.164140 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.170398 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.183306 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150\": container with ID starting with 02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150 not found: ID does not exist" containerID="02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.183356 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150"} err="failed to get container status \"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150\": rpc error: code = NotFound desc = could not find container \"02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150\": container with ID starting with 02918751636aff4418d41051495955357c3c4593eaf6a4184652ae7600897150 not found: ID does not exist" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.183386 4780 scope.go:117] "RemoveContainer" containerID="0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.213367 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info" (OuterVolumeSpecName: "pod-info") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.213395 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.220193 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224266 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c" (OuterVolumeSpecName: "kube-api-access-zxl4c") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "kube-api-access-zxl4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224385 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224605 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8" (OuterVolumeSpecName: "kube-api-access-qwrc8") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "kube-api-access-qwrc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224645 4780 generic.go:334] "Generic (PLEG): container finished" podID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" containerID="64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a" exitCode=2 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224741 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed88e38f-cb35-4072-8f9f-1c6ab980ec03","Type":"ContainerDied","Data":"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224774 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ed88e38f-cb35-4072-8f9f-1c6ab980ec03","Type":"ContainerDied","Data":"a5efc05caf3f3f05d91017d190edd659da437fd2cdab923cd83902f41b32ce6d"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.224836 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240379 4780 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240434 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240447 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240458 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxl4c\" (UniqueName: \"kubernetes.io/projected/6105150b-678d-4925-a981-9a0d75377f32-kube-api-access-zxl4c\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240474 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6105150b-678d-4925-a981-9a0d75377f32-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240488 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrc8\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-kube-api-access-qwrc8\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240498 4780 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2ee2741-9417-4698-b550-7c596d00d271-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240509 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240521 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.240532 4780 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2ee2741-9417-4698-b550-7c596d00d271-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.264611 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0c9e-account-delete-l4t5r" event={"ID":"622d766f-f43c-434c-9353-2315a6c82ae6","Type":"ContainerDied","Data":"cacf127956e62910a5b82c40d040fb3d978a2f3b2ee90cad223789607a1d6831"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.264740 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi0c9e-account-delete-l4t5r" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.269199 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts" (OuterVolumeSpecName: "scripts") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.272324 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementb644-account-delete-t2p8t" event={"ID":"503714fd-6dcf-4b1d-8806-dd78a3e85b7f","Type":"ContainerDied","Data":"4895d72a5fe414b04d0bbbd73c376248068817a47d2969abfb8fe95e9c93bd75"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.272952 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementb644-account-delete-t2p8t" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.298927 4780 generic.go:334] "Generic (PLEG): container finished" podID="b90472c3-a09d-433c-922b-d164a11636e6" containerID="0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0" exitCode=0 Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.299080 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.300733 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerDied","Data":"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.300775 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b90472c3-a09d-433c-922b-d164a11636e6","Type":"ContainerDied","Data":"4eba6f78ef2f0e7a082b6b36ad455efbe29837115d31d76feb7d9fa569257107"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.304886 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.309411 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell063ac-account-delete-rvt6m" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.315767 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron33e2-account-delete-nr86j" event={"ID":"83f061df-a5ff-4db1-b87f-4106a5e56b55","Type":"ContainerDied","Data":"9276d955249f1f3bcb322a56e077252cb95db4ceccbb258e14cb001bb0637021"} Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.315907 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron33e2-account-delete-nr86j" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.316169 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican32a4-account-delete-xckl5" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.320624 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-58b5d8cc69-dbww7" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.344850 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.362147 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.372704 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" (UID: "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.380229 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b7f300da-65dd-4c6e-ae4a-63b797768651" (UID: "b7f300da-65dd-4c6e-ae4a-63b797768651"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.429842 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" (UID: "792eb9b5-5b6a-4c61-bc3f-8ab53d64a248"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.466419 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf" (OuterVolumeSpecName: "server-conf") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.473224 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.504454 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data" (OuterVolumeSpecName: "config-data") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.514991 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7f300da-65dd-4c6e-ae4a-63b797768651-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.516617 4780 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.516656 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.516674 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.516689 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.516703 4780 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2ee2741-9417-4698-b550-7c596d00d271-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.535135 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data" (OuterVolumeSpecName: "config-data") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.542571 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.564902 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.565276 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf" (OuterVolumeSpecName: "server-conf") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.598337 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d2ee2741-9417-4698-b550-7c596d00d271" (UID: "d2ee2741-9417-4698-b550-7c596d00d271"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.612381 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b90472c3-a09d-433c-922b-d164a11636e6" (UID: "b90472c3-a09d-433c-922b-d164a11636e6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626705 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626732 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626764 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b90472c3-a09d-433c-922b-d164a11636e6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626779 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626791 4780 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b90472c3-a09d-433c-922b-d164a11636e6-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.626802 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2ee2741-9417-4698-b550-7c596d00d271-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.681924 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.711283 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6105150b-678d-4925-a981-9a0d75377f32" (UID: "6105150b-678d-4925-a981-9a0d75377f32"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.722652 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.724209 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.728335 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.728490 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.730095 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.730134 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6105150b-678d-4925-a981-9a0d75377f32-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.735910 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.736183 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.736407 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.736438 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.738712 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.762766 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.765063 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" path="/var/lib/kubelet/pods/02521078-2e58-4ce2-bc12-0b6c3b2ed878/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.765745 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f614b85-1709-4020-87c7-c349da7de2c8" path="/var/lib/kubelet/pods/3f614b85-1709-4020-87c7-c349da7de2c8/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.766281 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1e36a5-f7ff-4c0b-b950-382d6123b571" path="/var/lib/kubelet/pods/5d1e36a5-f7ff-4c0b-b950-382d6123b571/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.771930 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" path="/var/lib/kubelet/pods/628b549e-6d99-43d4-94bb-61b457f4c37b/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.772823 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" path="/var/lib/kubelet/pods/62b9c388-0f74-42fc-bf3d-711322b976d8/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.773414 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c94ee7f-d255-4413-817d-77759b7d0e80" path="/var/lib/kubelet/pods/6c94ee7f-d255-4413-817d-77759b7d0e80/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.773989 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:18 crc kubenswrapper[4780]: E0929 19:05:18.774073 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" 
podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.774513 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" path="/var/lib/kubelet/pods/e14f7a20-d45e-4662-b0db-4af394c7daed/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.775359 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81be594-85ee-4be4-8afd-ec5580651ec7" path="/var/lib/kubelet/pods/e81be594-85ee-4be4-8afd-ec5580651ec7/volumes" Sep 29 19:05:18 crc kubenswrapper[4780]: I0929 19:05:18.779687 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" path="/var/lib/kubelet/pods/f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d/volumes" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.341360 4780 generic.go:334] "Generic (PLEG): container finished" podID="48191511-38e9-46d2-82f8-77453769927c" containerID="f87b8bafb323301052d22ea81d2721d5221500537424fea022247a8e792a03e3" exitCode=0 Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.341540 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerDied","Data":"f87b8bafb323301052d22ea81d2721d5221500537424fea022247a8e792a03e3"} Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.346111 4780 generic.go:334] "Generic (PLEG): container finished" podID="bc401926-3969-448c-9910-22572fecb168" containerID="1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" exitCode=0 Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.346171 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bc401926-3969-448c-9910-22572fecb168","Type":"ContainerDied","Data":"1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61"} Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.346197 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bc401926-3969-448c-9910-22572fecb168","Type":"ContainerDied","Data":"c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502"} Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.346215 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b10ce8993705e4f4545cd5c2072ccf52ecd829746e872de9b0de3f1ef9e502" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.348168 4780 generic.go:334] "Generic (PLEG): container finished" podID="58ef0b7e-a06d-49a2-824e-9f088c267a97" containerID="fa44c2b6e56600dfb6c99d6fb0e419237762ff70fabe663a6e3f18eded510c50" exitCode=0 Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.348222 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58ef0b7e-a06d-49a2-824e-9f088c267a97","Type":"ContainerDied","Data":"fa44c2b6e56600dfb6c99d6fb0e419237762ff70fabe663a6e3f18eded510c50"} Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.348244 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"58ef0b7e-a06d-49a2-824e-9f088c267a97","Type":"ContainerDied","Data":"1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337"} Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.348326 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fbbeffc98f3d9d74e5977309189337422cb12fa45b0b3ab6d4f24ef62680337" Sep 29 19:05:19 crc 
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.349909 4780 generic.go:334] "Generic (PLEG): container finished" podID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerID="06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" exitCode=0
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.349957 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec846e3f-c11b-4818-a15b-9f855ed48a56","Type":"ContainerDied","Data":"06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.349977 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec846e3f-c11b-4818-a15b-9f855ed48a56","Type":"ContainerDied","Data":"f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.349990 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9fd512f737c293fc7bcf9992914bb134985170fe05c6a33fc8f247beb3e2550"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.354088 4780 generic.go:334] "Generic (PLEG): container finished" podID="e42e5bce-9395-4758-8121-35408b6df2e2" containerID="2206fdfda1b3679c9eaab7892ccf4c32611624a3996175a4dd0502159b261a25" exitCode=0
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.354142 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerDied","Data":"2206fdfda1b3679c9eaab7892ccf4c32611624a3996175a4dd0502159b261a25"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.354279 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e5bce-9395-4758-8121-35408b6df2e2","Type":"ContainerDied","Data":"edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.354303 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edeb186821bfae5ae8de8319bad00b82bcd3b508d07041d3d8663c396821a5e0"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.356563 4780 generic.go:334] "Generic (PLEG): container finished" podID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerID="b7c50fc2d9534221112ba9758fee8b52356d9efe5f5ed3fb8c0432498719f180" exitCode=0
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.356616 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerDied","Data":"b7c50fc2d9534221112ba9758fee8b52356d9efe5f5ed3fb8c0432498719f180"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.356639 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74988cff4c-fmczd" event={"ID":"8150bb34-1bc0-4c45-92f8-9d8d04f611e3","Type":"ContainerDied","Data":"2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.356776 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ed2f9a0db4527a7eb67b707e992fa96d67a3759ccc3aab3d8b5da1a78ed0e75"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.392158 4780 generic.go:334] "Generic (PLEG): container finished" podID="8e1d2b75-0893-468d-8365-f08fa8875575" containerID="a3bd0c44347129dd3eb6a433ef0ca6e0cd25372b1057012a51a9c55da4d8ff4a" exitCode=0
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.392267 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerDied","Data":"a3bd0c44347129dd3eb6a433ef0ca6e0cd25372b1057012a51a9c55da4d8ff4a"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.398703 4780 generic.go:334] "Generic (PLEG): container finished" podID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerID="739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" exitCode=0
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.398756 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3","Type":"ContainerDied","Data":"739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.398791 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3","Type":"ContainerDied","Data":"e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676"}
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.398804 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e492c7360b9e5e03757464a55f0446e2e9be786e11bc83cef2cd125e0c612676"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.408577 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.413315 4780 scope.go:117] "RemoveContainer" containerID="eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.433428 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.462764 4780 scope.go:117] "RemoveContainer" containerID="0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec"
Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.463667 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec\": container with ID starting with 0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec not found: ID does not exist" containerID="0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.463723 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec"} err="failed to get container status \"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec\": rpc error: code = NotFound desc = could not find container \"0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec\": container with ID starting with 0d2806d2b3924dc7a81d4bfd75c1503e1e445633e9af779f41abb77bd2f744ec not found: ID does not exist"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.463753 4780 scope.go:117] "RemoveContainer" containerID="eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"
Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.464282 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1\": container with ID starting with eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1 not found: ID does not exist" containerID="eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.464315 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1"} err="failed to get container status \"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1\": rpc error: code = NotFound desc = could not find container \"eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1\": container with ID starting with eb92b65589bdbe3397a60fd5a2bb2923d2180e9f9a791f34bf8dd85d99bed5b1 not found: ID does not exist"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.464337 4780 scope.go:117] "RemoveContainer" containerID="eeee69b0a809e51c2de8aae84184f344369f4e4f6fab7ebfa4f65f602565ed13"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.490810 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.529989 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.534902 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74988cff4c-fmczd"
Need to start a new one" pod="openstack/barbican-worker-74988cff4c-fmczd" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.537369 4780 scope.go:117] "RemoveContainer" containerID="22c83df1dfa900462fb0bdf93010df94b1f1fbd660599f3ce6a52119f57afbe9" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.539756 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8gs\" (UniqueName: \"kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs\") pod \"58ef0b7e-a06d-49a2-824e-9f088c267a97\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.539817 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.539860 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle\") pod \"58ef0b7e-a06d-49a2-824e-9f088c267a97\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540062 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540113 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config\") pod \"58ef0b7e-a06d-49a2-824e-9f088c267a97\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540134 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m28bl\" (UniqueName: \"kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540202 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540240 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540277 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs\") pod \"58ef0b7e-a06d-49a2-824e-9f088c267a97\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540322 4780 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540355 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data\") pod \"58ef0b7e-a06d-49a2-824e-9f088c267a97\" (UID: \"58ef0b7e-a06d-49a2-824e-9f088c267a97\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540386 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.540413 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data\") pod \"e42e5bce-9395-4758-8121-35408b6df2e2\" (UID: \"e42e5bce-9395-4758-8121-35408b6df2e2\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.541573 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.542279 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "58ef0b7e-a06d-49a2-824e-9f088c267a97" (UID: "58ef0b7e-a06d-49a2-824e-9f088c267a97"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.542337 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data" (OuterVolumeSpecName: "config-data") pod "58ef0b7e-a06d-49a2-824e-9f088c267a97" (UID: "58ef0b7e-a06d-49a2-824e-9f088c267a97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.542677 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.559787 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts" (OuterVolumeSpecName: "scripts") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.560363 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs" (OuterVolumeSpecName: "kube-api-access-tw8gs") pod "58ef0b7e-a06d-49a2-824e-9f088c267a97" (UID: "58ef0b7e-a06d-49a2-824e-9f088c267a97"). InnerVolumeSpecName "kube-api-access-tw8gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.565979 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl" (OuterVolumeSpecName: "kube-api-access-m28bl") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "kube-api-access-m28bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.587924 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.597795 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58ef0b7e-a06d-49a2-824e-9f088c267a97" (UID: "58ef0b7e-a06d-49a2-824e-9f088c267a97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.629028 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "58ef0b7e-a06d-49a2-824e-9f088c267a97" (UID: "58ef0b7e-a06d-49a2-824e-9f088c267a97"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.639442 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.641928 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92w8q\" (UniqueName: \"kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q\") pod \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.641978 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs\") pod \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642005 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxfxd\" (UniqueName: \"kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd\") pod \"ec846e3f-c11b-4818-a15b-9f855ed48a56\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642417 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data\") pod \"ec846e3f-c11b-4818-a15b-9f855ed48a56\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642526 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data\") pod \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642599 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom\") pod \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642638 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms\") pod \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642698 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle\") pod \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\" (UID: \"8150bb34-1bc0-4c45-92f8-9d8d04f611e3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642731 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle\") pod \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\" (UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642763 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data\") pod \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\" 
(UID: \"aa6b4d2f-2f81-44fd-8c76-2aa6204209c3\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.642860 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle\") pod \"ec846e3f-c11b-4818-a15b-9f855ed48a56\" (UID: \"ec846e3f-c11b-4818-a15b-9f855ed48a56\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643497 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8gs\" (UniqueName: \"kubernetes.io/projected/58ef0b7e-a06d-49a2-824e-9f088c267a97-kube-api-access-tw8gs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643519 4780 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643533 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643547 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643558 4780 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643572 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m28bl\" (UniqueName: \"kubernetes.io/projected/e42e5bce-9395-4758-8121-35408b6df2e2-kube-api-access-m28bl\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643583 4780 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643593 4780 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ef0b7e-a06d-49a2-824e-9f088c267a97-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643605 4780 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643619 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58ef0b7e-a06d-49a2-824e-9f088c267a97-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.643630 4780 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e5bce-9395-4758-8121-35408b6df2e2-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.646671 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs" (OuterVolumeSpecName: "logs") pod "8150bb34-1bc0-4c45-92f8-9d8d04f611e3" (UID: "8150bb34-1bc0-4c45-92f8-9d8d04f611e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.647929 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd" (OuterVolumeSpecName: "kube-api-access-hxfxd") pod "ec846e3f-c11b-4818-a15b-9f855ed48a56" (UID: "ec846e3f-c11b-4818-a15b-9f855ed48a56"). InnerVolumeSpecName "kube-api-access-hxfxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.648725 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q" (OuterVolumeSpecName: "kube-api-access-92w8q") pod "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" (UID: "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3"). InnerVolumeSpecName "kube-api-access-92w8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.665717 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8150bb34-1bc0-4c45-92f8-9d8d04f611e3" (UID: "8150bb34-1bc0-4c45-92f8-9d8d04f611e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.670619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms" (OuterVolumeSpecName: "kube-api-access-pxdms") pod "8150bb34-1bc0-4c45-92f8-9d8d04f611e3" (UID: "8150bb34-1bc0-4c45-92f8-9d8d04f611e3"). InnerVolumeSpecName "kube-api-access-pxdms". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.677074 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.680640 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8150bb34-1bc0-4c45-92f8-9d8d04f611e3" (UID: "8150bb34-1bc0-4c45-92f8-9d8d04f611e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.684550 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data" (OuterVolumeSpecName: "config-data") pod "e42e5bce-9395-4758-8121-35408b6df2e2" (UID: "e42e5bce-9395-4758-8121-35408b6df2e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.702242 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data" (OuterVolumeSpecName: "config-data") pod "ec846e3f-c11b-4818-a15b-9f855ed48a56" (UID: "ec846e3f-c11b-4818-a15b-9f855ed48a56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.702424 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data" (OuterVolumeSpecName: "config-data") pod "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" (UID: "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.703701 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec846e3f-c11b-4818-a15b-9f855ed48a56" (UID: "ec846e3f-c11b-4818-a15b-9f855ed48a56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.705167 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.713457 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.735302 4780 scope.go:117] "RemoveContainer" containerID="7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745095 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745122 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745131 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745141 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745152 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92w8q\" (UniqueName: \"kubernetes.io/projected/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-kube-api-access-92w8q\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745162 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc 
kubenswrapper[4780]: I0929 19:05:19.745172 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxfxd\" (UniqueName: \"kubernetes.io/projected/ec846e3f-c11b-4818-a15b-9f855ed48a56-kube-api-access-hxfxd\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745180 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec846e3f-c11b-4818-a15b-9f855ed48a56-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745188 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745196 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e5bce-9395-4758-8121-35408b6df2e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.745204 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-kube-api-access-pxdms\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.748038 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data" (OuterVolumeSpecName: "config-data") pod "8150bb34-1bc0-4c45-92f8-9d8d04f611e3" (UID: "8150bb34-1bc0-4c45-92f8-9d8d04f611e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.780732 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.781867 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" (UID: "aa6b4d2f-2f81-44fd-8c76-2aa6204209c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.784670 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.791030 4780 scope.go:117] "RemoveContainer" containerID="a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.799768 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-58b5d8cc69-dbww7"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.818541 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.829083 4780 scope.go:117] "RemoveContainer" containerID="7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1" Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.835026 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1\": container with ID starting with 7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1 not found: ID does not exist" containerID="7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.835102 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1"} err="failed to get container status \"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1\": rpc error: code = NotFound desc = could not find container \"7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1\": container with ID starting with 7dbbd881a8a3f212a8c2d207188cd2f0581e981f15a110ce34610afe0d1108f1 not found: ID does not exist" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.835131 4780 scope.go:117] "RemoveContainer" containerID="a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a" Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.838560 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a\": container with ID starting with a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a not found: ID does not exist" containerID="a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.838632 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a"} err="failed to get container status \"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a\": rpc error: code = NotFound desc = could not find container \"a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a\": container with ID starting with a2cec310dc2e759b1ff1bbaac1cc61867cc1b16788932717c79cdb1d18a7a04a not found: ID does not exist" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.838679 4780 scope.go:117] "RemoveContainer" containerID="968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.842939 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi0c9e-account-delete-l4t5r"] Sep 
29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846249 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle\") pod \"8e1d2b75-0893-468d-8365-f08fa8875575\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846301 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs\") pod \"8e1d2b75-0893-468d-8365-f08fa8875575\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846345 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data\") pod \"bc401926-3969-448c-9910-22572fecb168\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846373 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghlcg\" (UniqueName: \"kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg\") pod \"bc401926-3969-448c-9910-22572fecb168\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846415 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle\") pod \"bc401926-3969-448c-9910-22572fecb168\" (UID: \"bc401926-3969-448c-9910-22572fecb168\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846487 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom\") pod \"8e1d2b75-0893-468d-8365-f08fa8875575\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846594 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxw5\" (UniqueName: \"kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5\") pod \"8e1d2b75-0893-468d-8365-f08fa8875575\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846629 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data\") pod \"8e1d2b75-0893-468d-8365-f08fa8875575\" (UID: \"8e1d2b75-0893-468d-8365-f08fa8875575\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.846850 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs" (OuterVolumeSpecName: "logs") pod "8e1d2b75-0893-468d-8365-f08fa8875575" (UID: "8e1d2b75-0893-468d-8365-f08fa8875575"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.847218 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8150bb34-1bc0-4c45-92f8-9d8d04f611e3-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.847238 4780 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e1d2b75-0893-468d-8365-f08fa8875575-logs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.847248 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.860999 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg" (OuterVolumeSpecName: "kube-api-access-ghlcg") pod "bc401926-3969-448c-9910-22572fecb168" (UID: "bc401926-3969-448c-9910-22572fecb168"). InnerVolumeSpecName "kube-api-access-ghlcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.861137 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8e1d2b75-0893-468d-8365-f08fa8875575" (UID: "8e1d2b75-0893-468d-8365-f08fa8875575"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.865930 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.866219 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5" (OuterVolumeSpecName: "kube-api-access-kmxw5") pod "8e1d2b75-0893-468d-8365-f08fa8875575" (UID: "8e1d2b75-0893-468d-8365-f08fa8875575"). InnerVolumeSpecName "kube-api-access-kmxw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.882893 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc401926-3969-448c-9910-22572fecb168" (UID: "bc401926-3969-448c-9910-22572fecb168"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.890503 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data" (OuterVolumeSpecName: "config-data") pod "bc401926-3969-448c-9910-22572fecb168" (UID: "bc401926-3969-448c-9910-22572fecb168"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.891108 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.896397 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e1d2b75-0893-468d-8365-f08fa8875575" (UID: "8e1d2b75-0893-468d-8365-f08fa8875575"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.908134 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.919645 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5f568c9c76-zb5pj"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.928565 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.938251 4780 scope.go:117] "RemoveContainer" containerID="0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.941161 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data" (OuterVolumeSpecName: "config-data") pod "8e1d2b75-0893-468d-8365-f08fa8875575" (UID: "8e1d2b75-0893-468d-8365-f08fa8875575"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.944337 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951546 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951616 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951679 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951733 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951788 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qpg4\" (UniqueName: \"kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4\") pod 
\"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951811 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951840 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951879 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.951906 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle\") pod \"48191511-38e9-46d2-82f8-77453769927c\" (UID: \"48191511-38e9-46d2-82f8-77453769927c\") " Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952296 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxw5\" (UniqueName: \"kubernetes.io/projected/8e1d2b75-0893-468d-8365-f08fa8875575-kube-api-access-kmxw5\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952315 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952327 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952338 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952349 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghlcg\" (UniqueName: \"kubernetes.io/projected/bc401926-3969-448c-9910-22572fecb168-kube-api-access-ghlcg\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952360 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc401926-3969-448c-9910-22572fecb168-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.952371 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e1d2b75-0893-468d-8365-f08fa8875575-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.955234 4780 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.955641 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.961394 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.961577 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4" (OuterVolumeSpecName: "kube-api-access-6qpg4") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "kube-api-access-6qpg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.961931 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.962676 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.965170 4780 scope.go:117] "RemoveContainer" containerID="968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c" Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.966039 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c\": container with ID starting with 968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c not found: ID does not exist" containerID="968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.966091 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c"} err="failed to get container status \"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c\": rpc error: code = NotFound desc = could not find container \"968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c\": container with ID starting with 968dbdafc1e38cdf05cacd62bb78c03763ce0c8831fdda0a6ba1c49d1b27961c not found: ID does not exist" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.978820 4780 scope.go:117] "RemoveContainer" containerID="0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.989372 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets" (OuterVolumeSpecName: "secrets") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:19 crc kubenswrapper[4780]: E0929 19:05:19.990235 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1\": container with ID starting with 0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1 not found: ID does not exist" containerID="0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.990276 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1"} err="failed to get container status \"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1\": rpc error: code = NotFound desc = could not find container \"0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1\": container with ID starting with 0b0ebc253fe05deda4a5d682af27b4d438941308b9b4153183332928d7d40db1 not found: ID does not exist" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.990303 4780 scope.go:117] "RemoveContainer" containerID="a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac" Sep 29 19:05:19 crc kubenswrapper[4780]: I0929 19:05:19.992591 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.008871 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell063ac-account-delete-rvt6m"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.025361 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.026597 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3683c554-eec7-4825-8972-0445faf15a23/ovn-northd/0.log" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.026730 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.028860 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.197:6080/vnc_lite.html\": context deadline exceeded" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.035611 4780 scope.go:117] "RemoveContainer" containerID="9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.035745 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.036779 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.042066 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "48191511-38e9-46d2-82f8-77453769927c" (UID: "48191511-38e9-46d2-82f8-77453769927c"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.043925 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053636 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qpg4\" (UniqueName: \"kubernetes.io/projected/48191511-38e9-46d2-82f8-77453769927c-kube-api-access-6qpg4\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053669 4780 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-operator-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053698 4780 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053708 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-config-data-default\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053717 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053728 4780 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48191511-38e9-46d2-82f8-77453769927c-kolla-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053737 4780 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48191511-38e9-46d2-82f8-77453769927c-config-data-generated\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053746 4780 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48191511-38e9-46d2-82f8-77453769927c-secrets\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.053790 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.072120 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican32a4-account-delete-xckl5"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.075717 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.080003 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.081608 4780 scope.go:117] "RemoveContainer" containerID="a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.082703 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac\": container with ID starting with a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac not found: ID does not exist" containerID="a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.082742 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac"} err="failed to get container status \"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac\": rpc error: code = NotFound desc = could not find container \"a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac\": container with ID starting with a0909ba11b8c055e0a9873870330cfce4c7d1a1024c31e589522226ae4d8e3ac not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.082772 4780 scope.go:117] "RemoveContainer" containerID="9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.083025 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3\": container with ID starting with 9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3 not found: ID does not exist" containerID="9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.083078 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3"} err="failed to get container status \"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3\": rpc error: code = NotFound desc = could not find container \"9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3\": container with ID starting with 9dc1651fa4accf9f78a4c55cca09162e6b14fe9e8b6b18d3b1e283c5ae4b47d3 not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.083093 4780 scope.go:117] "RemoveContainer" containerID="3f91861bf876fab40cbb103b723d6d21a4fb3a2aeadedbbda6de0035a6ee2aa7" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.088690 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6764d576f6-q7trv"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.097786 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.105939 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron33e2-account-delete-nr86j"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.115428 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.122405 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementb644-account-delete-t2p8t"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.123162 4780 scope.go:117] "RemoveContainer" containerID="aae9731ab2bae2a8e2eb268dc27032c196b1db8b2299a7d349eab203f6ba9217" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.130830 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 
19:05:20.137176 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.149669 4780 scope.go:117] "RemoveContainer" containerID="64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.156857 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.156920 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.157031 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.157063 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m48s\" (UniqueName: \"kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.157089 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.157140 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.157610 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle\") pod \"3683c554-eec7-4825-8972-0445faf15a23\" (UID: \"3683c554-eec7-4825-8972-0445faf15a23\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.158111 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.160091 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config" (OuterVolumeSpecName: "config") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.160478 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.161090 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts" (OuterVolumeSpecName: "scripts") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.171501 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s" (OuterVolumeSpecName: "kube-api-access-7m48s") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "kube-api-access-7m48s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.178660 4780 scope.go:117] "RemoveContainer" containerID="64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.179362 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a\": container with ID starting with 64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a not found: ID does not exist" containerID="64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.179403 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a"} err="failed to get container status \"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a\": rpc error: code = NotFound desc = could not find container \"64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a\": container with ID starting with 64cedc3d06ecbdf5c0ae07de8fe1d464415b4216c1e8b7ce810514c3508ff44a not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.179431 4780 scope.go:117] "RemoveContainer" containerID="1bf559685fa2ec4cdd7e6a75b9791a1126fabef80566a5ec0ad7a61e63c638ce" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.216467 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.223956 4780 scope.go:117] "RemoveContainer" containerID="325c5a786e1853685f85b6fc6f7bdd59ae3bf16fce0c288ff4e80cd0ba149002" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.259417 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.259449 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m48s\" (UniqueName: \"kubernetes.io/projected/3683c554-eec7-4825-8972-0445faf15a23-kube-api-access-7m48s\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.259462 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3683c554-eec7-4825-8972-0445faf15a23-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.259475 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.259486 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3683c554-eec7-4825-8972-0445faf15a23-ovn-rundir\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.262192 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.270187 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "3683c554-eec7-4825-8972-0445faf15a23" (UID: "3683c554-eec7-4825-8972-0445faf15a23"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.364568 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.364613 4780 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3683c554-eec7-4825-8972-0445faf15a23-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.404012 4780 scope.go:117] "RemoveContainer" containerID="0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.415651 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.417255 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-79b866b5dd-2f72g" event={"ID":"8e1d2b75-0893-468d-8365-f08fa8875575","Type":"ContainerDied","Data":"3c9d220b4cfd41006d2222c21a7022c7d464d547797c016309f69d63385cbc99"} Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.420716 4780 generic.go:334] "Generic (PLEG): container finished" podID="ef4fe84d-ff10-4ed2-938a-669c30748336" containerID="c0d63b73993fb464d59534865d2e91fd588cd9d7e1421e00fa83f404e4d2d957" exitCode=0 Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.420788 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65cff5765c-kflf7" event={"ID":"ef4fe84d-ff10-4ed2-938a-669c30748336","Type":"ContainerDied","Data":"c0d63b73993fb464d59534865d2e91fd588cd9d7e1421e00fa83f404e4d2d957"} Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.423378 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3683c554-eec7-4825-8972-0445faf15a23/ovn-northd/0.log" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.423447 4780 generic.go:334] "Generic (PLEG): container finished" podID="3683c554-eec7-4825-8972-0445faf15a23" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" exitCode=139 Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.423499 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.423548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerDied","Data":"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6"} Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.423626 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3683c554-eec7-4825-8972-0445faf15a23","Type":"ContainerDied","Data":"b201ab686e7ffc57334c1ed50e9946de9b263459962e100d7ba2be026512b575"} Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.433345 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.433369 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48191511-38e9-46d2-82f8-77453769927c","Type":"ContainerDied","Data":"3d2e10a927a85365dd4b2fa87dcee5ec79b99f946e90d4eb60c6b829653af6ee"} Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.440315 4780 scope.go:117] "RemoveContainer" containerID="f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.455521 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.459711 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.463230 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.463291 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-74988cff4c-fmczd" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.463798 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.463820 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.465177 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.488015 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-79b866b5dd-2f72g"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.501085 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.501881 4780 scope.go:117] "RemoveContainer" containerID="0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.502306 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0\": container with ID starting with 0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0 not found: ID does not exist" containerID="0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.502338 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0"} err="failed to get container status \"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0\": rpc error: code = NotFound desc = could not find container \"0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0\": container with ID starting with 0f0c140bd1c18d27a61395e7ab256190d7e1c30d636fbd034038bde07a5e87a0 not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.502367 4780 scope.go:117] "RemoveContainer" containerID="f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.502639 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b\": container with ID starting with f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b not found: ID does not exist" containerID="f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.502692 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b"} err="failed to get container status \"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b\": rpc error: code = NotFound desc = could not find container \"f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b\": container with ID starting with f50d9dd816230dcd008ea892fcd39784e69e39f625e56a12dc24c211b505465b not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.502711 4780 scope.go:117] "RemoveContainer" 
containerID="3383dec5afb930a94bac2f13b88f7cee0613113811cd7f239db85c1098ebc8d7" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.512879 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.527985 4780 scope.go:117] "RemoveContainer" containerID="a3bd0c44347129dd3eb6a433ef0ca6e0cd25372b1057012a51a9c55da4d8ff4a" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.584974 4780 scope.go:117] "RemoveContainer" containerID="4cac85a12f0ad40a5dc9410707339b2f4a75fbc7d7e9f99310b24f564e2e4f03" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.606074 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.612494 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.626621 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.630358 4780 scope.go:117] "RemoveContainer" containerID="4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.636949 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-74988cff4c-fmczd"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.646643 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.663095 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.674856 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.684151 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.689237 4780 scope.go:117] "RemoveContainer" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.690803 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.701840 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.719777 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.723184 4780 scope.go:117] "RemoveContainer" containerID="4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.725154 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083\": container with ID starting with 4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083 not found: ID does not exist" containerID="4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.725190 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083"} 
err="failed to get container status \"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083\": rpc error: code = NotFound desc = could not find container \"4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083\": container with ID starting with 4418f46aa952e590892b86a57e9a08559ae62e9f515821f563275bc6012a5083 not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.725215 4780 scope.go:117] "RemoveContainer" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" Sep 29 19:05:20 crc kubenswrapper[4780]: E0929 19:05:20.727117 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6\": container with ID starting with 28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6 not found: ID does not exist" containerID="28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.727141 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6"} err="failed to get container status \"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6\": rpc error: code = NotFound desc = could not find container \"28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6\": container with ID starting with 28c92cf41e05f2ec2d2bef0057fa63bfe106ccdd28128cb08dedb89f890782f6 not found: ID does not exist" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.727156 4780 scope.go:117] "RemoveContainer" containerID="f87b8bafb323301052d22ea81d2721d5221500537424fea022247a8e792a03e3" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.727283 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.735138 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.740092 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.745174 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.754732 4780 scope.go:117] "RemoveContainer" containerID="463b5f829fe19e14d7099fe53489e9ad00fece5bc4c41ff73db16873402d77dc" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.773990 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774227 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774262 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys\") pod 
\"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774293 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774368 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774420 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774490 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.774528 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6948l\" (UniqueName: \"kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l\") pod \"ef4fe84d-ff10-4ed2-938a-669c30748336\" (UID: \"ef4fe84d-ff10-4ed2-938a-669c30748336\") " Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.785461 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.785596 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l" (OuterVolumeSpecName: "kube-api-access-6948l") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "kube-api-access-6948l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.801395 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.803080 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts" (OuterVolumeSpecName: "scripts") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.803381 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3683c554-eec7-4825-8972-0445faf15a23" path="/var/lib/kubelet/pods/3683c554-eec7-4825-8972-0445faf15a23/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.804174 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48191511-38e9-46d2-82f8-77453769927c" path="/var/lib/kubelet/pods/48191511-38e9-46d2-82f8-77453769927c/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.804743 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503714fd-6dcf-4b1d-8806-dd78a3e85b7f" path="/var/lib/kubelet/pods/503714fd-6dcf-4b1d-8806-dd78a3e85b7f/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.806316 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ef0b7e-a06d-49a2-824e-9f088c267a97" path="/var/lib/kubelet/pods/58ef0b7e-a06d-49a2-824e-9f088c267a97/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.806799 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" path="/var/lib/kubelet/pods/5d5ccc95-6c2c-4f3c-884b-456cf28d6db4/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.807428 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6105150b-678d-4925-a981-9a0d75377f32" path="/var/lib/kubelet/pods/6105150b-678d-4925-a981-9a0d75377f32/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.808840 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622d766f-f43c-434c-9353-2315a6c82ae6" path="/var/lib/kubelet/pods/622d766f-f43c-434c-9353-2315a6c82ae6/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.810663 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" path="/var/lib/kubelet/pods/6422eb63-373a-4b79-88b0-ddd623f7bd79/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.814419 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" path="/var/lib/kubelet/pods/6c538b0f-23b3-440d-9775-5f33f7badfd4/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.815692 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" path="/var/lib/kubelet/pods/792eb9b5-5b6a-4c61-bc3f-8ab53d64a248/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.816396 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" path="/var/lib/kubelet/pods/8150bb34-1bc0-4c45-92f8-9d8d04f611e3/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.832420 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f061df-a5ff-4db1-b87f-4106a5e56b55" path="/var/lib/kubelet/pods/83f061df-a5ff-4db1-b87f-4106a5e56b55/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.836699 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" path="/var/lib/kubelet/pods/8e1d2b75-0893-468d-8365-f08fa8875575/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.837596 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" 
path="/var/lib/kubelet/pods/aa6b4d2f-2f81-44fd-8c76-2aa6204209c3/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.837600 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data" (OuterVolumeSpecName: "config-data") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.838229 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" path="/var/lib/kubelet/pods/b7f300da-65dd-4c6e-ae4a-63b797768651/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.839589 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc401926-3969-448c-9910-22572fecb168" path="/var/lib/kubelet/pods/bc401926-3969-448c-9910-22572fecb168/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.840361 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ee2741-9417-4698-b550-7c596d00d271" path="/var/lib/kubelet/pods/d2ee2741-9417-4698-b550-7c596d00d271/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.841440 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" path="/var/lib/kubelet/pods/e42e5bce-9395-4758-8121-35408b6df2e2/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.842207 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" path="/var/lib/kubelet/pods/ec846e3f-c11b-4818-a15b-9f855ed48a56/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.842705 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" path="/var/lib/kubelet/pods/ed88e38f-cb35-4072-8f9f-1c6ab980ec03/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.843742 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.843775 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed2917c-127a-4dbd-b951-6b141853e47c" path="/var/lib/kubelet/pods/eed2917c-127a-4dbd-b951-6b141853e47c/volumes" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.845619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.856809 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef4fe84d-ff10-4ed2-938a-669c30748336" (UID: "ef4fe84d-ff10-4ed2-938a-669c30748336"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.867914 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877734 4780 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877789 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877803 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6948l\" (UniqueName: \"kubernetes.io/projected/ef4fe84d-ff10-4ed2-938a-669c30748336-kube-api-access-6948l\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877815 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877825 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877835 4780 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877844 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:20 crc kubenswrapper[4780]: I0929 19:05:20.877852 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4fe84d-ff10-4ed2-938a-669c30748336-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:21 crc kubenswrapper[4780]: I0929 19:05:21.478107 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-65cff5765c-kflf7" event={"ID":"ef4fe84d-ff10-4ed2-938a-669c30748336","Type":"ContainerDied","Data":"8af15a8fae669be335a3c5a1557e9cfb34432f48bf945c3a6b737ebed91799b8"} Sep 29 19:05:21 crc kubenswrapper[4780]: I0929 19:05:21.478169 4780 scope.go:117] "RemoveContainer" containerID="c0d63b73993fb464d59534865d2e91fd588cd9d7e1421e00fa83f404e4d2d957" Sep 29 19:05:21 crc kubenswrapper[4780]: I0929 19:05:21.478177 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-65cff5765c-kflf7" Sep 29 19:05:21 crc kubenswrapper[4780]: I0929 19:05:21.514929 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:05:21 crc kubenswrapper[4780]: I0929 19:05:21.525883 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-65cff5765c-kflf7"] Sep 29 19:05:22 crc kubenswrapper[4780]: I0929 19:05:22.764295 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4fe84d-ff10-4ed2-938a-669c30748336" path="/var/lib/kubelet/pods/ef4fe84d-ff10-4ed2-938a-669c30748336/volumes" Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.734221 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.735013 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.735174 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.735514 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.735602 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.736570 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.738177 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:23 crc kubenswrapper[4780]: E0929 19:05:23.738257 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.428608 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.533702 4780 generic.go:334] "Generic (PLEG): container finished" podID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerID="3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d" exitCode=0 Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.533738 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerDied","Data":"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d"} Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.533777 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d954bbbf5-jklnq" event={"ID":"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8","Type":"ContainerDied","Data":"198fd34207081e7daec72f7217803b559a57952b4bf8e4b0c5ece602f309f027"} Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.533794 4780 scope.go:117] "RemoveContainer" containerID="3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.533809 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d954bbbf5-jklnq" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.552793 4780 scope.go:117] "RemoveContainer" containerID="3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.552865 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.552913 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.552994 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.553013 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.553090 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.553150 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2bw\" (UniqueName: \"kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.553194 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs\") pod \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\" (UID: \"aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8\") " Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.565633 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw" (OuterVolumeSpecName: "kube-api-access-cp2bw") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "kube-api-access-cp2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.570712 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.591853 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.595537 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config" (OuterVolumeSpecName: "config") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.598499 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.608188 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.614014 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" (UID: "aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.650664 4780 scope.go:117] "RemoveContainer" containerID="3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92" Sep 29 19:05:25 crc kubenswrapper[4780]: E0929 19:05:25.651194 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92\": container with ID starting with 3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92 not found: ID does not exist" containerID="3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.651247 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92"} err="failed to get container status \"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92\": rpc error: code = NotFound desc = could not find container \"3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92\": container with ID starting with 3cc39ad60ff5917747ffdd9279bc7690d9c9f92d6e3efe4f74b3e01ca0ff3e92 not found: ID does not exist" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.651285 4780 scope.go:117] "RemoveContainer" containerID="3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d" Sep 29 19:05:25 crc kubenswrapper[4780]: E0929 19:05:25.651570 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d\": container with ID starting with 3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d not found: ID does not exist" containerID="3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.651594 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d"} err="failed to get container status \"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d\": rpc error: code = NotFound desc = could not find container \"3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d\": container with ID starting with 3ba2d8985cf9df39727c85d40800239a32b5ff7208c524280e572adacb68331d not found: ID does not exist" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654486 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654515 4780 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654527 4780 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654538 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654552 4780 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654563 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2bw\" (UniqueName: \"kubernetes.io/projected/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-kube-api-access-cp2bw\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.654573 4780 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.873430 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"] Sep 29 19:05:25 crc kubenswrapper[4780]: I0929 19:05:25.878036 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d954bbbf5-jklnq"] Sep 29 19:05:26 crc kubenswrapper[4780]: I0929 19:05:26.764305 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" path="/var/lib/kubelet/pods/aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8/volumes" Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.732714 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.733759 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.734412 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.734533 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.734750 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.736434 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.737538 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:28 crc kubenswrapper[4780]: E0929 19:05:28.737575 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.733711 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.735126 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.735671 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.735785 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.735875 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.738397 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.740268 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:33 crc kubenswrapper[4780]: E0929 19:05:33.740397 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.733326 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.735004 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.735322 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.735841 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.736150 4780 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:05:38 crc 
kubenswrapper[4780]: E0929 19:05:38.736727 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.738609 4780 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Sep 29 19:05:38 crc kubenswrapper[4780]: E0929 19:05:38.738659 4780 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tqkx6" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.687672 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqkx6_3c91af49-2adc-47a1-892c-82da3b338492/ovs-vswitchd/0.log" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.689751 4780 generic.go:334] "Generic (PLEG): container finished" podID="3c91af49-2adc-47a1-892c-82da3b338492" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" exitCode=137 Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.689827 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerDied","Data":"d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de"} Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.808927 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqkx6_3c91af49-2adc-47a1-892c-82da3b338492/ovs-vswitchd/0.log" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.810023 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893459 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893558 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78swl\" (UniqueName: \"kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893582 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893616 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893646 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893707 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs\") pod \"3c91af49-2adc-47a1-892c-82da3b338492\" (UID: \"3c91af49-2adc-47a1-892c-82da3b338492\") " Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893777 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log" (OuterVolumeSpecName: "var-log") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893822 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib" (OuterVolumeSpecName: "var-lib") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.893920 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.894419 4780 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-log\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.894706 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run" (OuterVolumeSpecName: "var-run") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.894715 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts" (OuterVolumeSpecName: "scripts") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.894721 4780 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-lib\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.894757 4780 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-etc-ovs\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.899409 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl" (OuterVolumeSpecName: "kube-api-access-78swl") pod "3c91af49-2adc-47a1-892c-82da3b338492" (UID: "3c91af49-2adc-47a1-892c-82da3b338492"). InnerVolumeSpecName "kube-api-access-78swl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.995681 4780 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c91af49-2adc-47a1-892c-82da3b338492-var-run\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.995728 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78swl\" (UniqueName: \"kubernetes.io/projected/3c91af49-2adc-47a1-892c-82da3b338492-kube-api-access-78swl\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:40 crc kubenswrapper[4780]: I0929 19:05:40.995740 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c91af49-2adc-47a1-892c-82da3b338492-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.702086 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tqkx6_3c91af49-2adc-47a1-892c-82da3b338492/ovs-vswitchd/0.log" Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.703316 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tqkx6" event={"ID":"3c91af49-2adc-47a1-892c-82da3b338492","Type":"ContainerDied","Data":"1344e1d34b72f29344aad148d07f6a2a075170c1e62311fa75d1c29069ca2804"} Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.703393 4780 scope.go:117] "RemoveContainer" containerID="d7690415a09e0c16aee2c647fcdd103c63059bcd0ad03837d14fdd8ce81046de" Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.703613 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tqkx6" Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.735565 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.741643 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tqkx6"] Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.750628 4780 scope.go:117] "RemoveContainer" containerID="0fed10498ed54ec6c8b58702d8be0e27db0aa2385504b2a22a8361b759e08eea" Sep 29 19:05:41 crc kubenswrapper[4780]: I0929 19:05:41.775601 4780 scope.go:117] "RemoveContainer" containerID="56cd11a363afa5285113dcd494182baca0f5cd0564a4c59d2c667f8b958be968" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.474195 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.528518 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.528660 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock\") pod \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.528718 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache\") pod \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.528771 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blc65\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65\") pod \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.528801 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") pod \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\" (UID: \"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1\") " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.529381 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock" (OuterVolumeSpecName: "lock") pod "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.530290 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache" (OuterVolumeSpecName: "cache") pod "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.533195 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "swift") pod "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.533418 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65" (OuterVolumeSpecName: "kube-api-access-blc65") pod "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1"). InnerVolumeSpecName "kube-api-access-blc65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.537379 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" (UID: "d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.630160 4780 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.630315 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.630334 4780 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-lock\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.630346 4780 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-cache\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.630358 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blc65\" (UniqueName: \"kubernetes.io/projected/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1-kube-api-access-blc65\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.647405 4780 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.720747 4780 generic.go:334] "Generic (PLEG): container finished" podID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerID="ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a" exitCode=137 Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.720838 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a"} Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.720870 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1","Type":"ContainerDied","Data":"67a95d1bfbd9000e22869418dd0095b23f2601ac1a14e9957d34dae1a5b2a75e"} Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.720888 4780 scope.go:117] "RemoveContainer" containerID="ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.720976 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.731504 4780 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.741841 4780 scope.go:117] "RemoveContainer" containerID="7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.776166 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c91af49-2adc-47a1-892c-82da3b338492" path="/var/lib/kubelet/pods/3c91af49-2adc-47a1-892c-82da3b338492/volumes" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.777708 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.777781 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.784151 4780 scope.go:117] "RemoveContainer" containerID="967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.832700 4780 scope.go:117] "RemoveContainer" containerID="a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.856461 4780 scope.go:117] "RemoveContainer" containerID="2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.875539 4780 scope.go:117] "RemoveContainer" containerID="3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.906869 4780 scope.go:117] "RemoveContainer" containerID="ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.925237 4780 scope.go:117] "RemoveContainer" containerID="f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.944708 4780 scope.go:117] "RemoveContainer" containerID="ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.962004 4780 scope.go:117] "RemoveContainer" containerID="31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311" Sep 29 19:05:42 crc kubenswrapper[4780]: I0929 19:05:42.981969 4780 scope.go:117] "RemoveContainer" containerID="28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.002339 4780 scope.go:117] "RemoveContainer" containerID="229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.022300 4780 scope.go:117] "RemoveContainer" containerID="87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.041388 4780 scope.go:117] "RemoveContainer" containerID="c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.062029 4780 scope.go:117] "RemoveContainer" containerID="00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.094836 4780 scope.go:117] "RemoveContainer" containerID="ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a" Sep 29 19:05:43 
crc kubenswrapper[4780]: E0929 19:05:43.095895 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a\": container with ID starting with ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a not found: ID does not exist" containerID="ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.095954 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a"} err="failed to get container status \"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a\": rpc error: code = NotFound desc = could not find container \"ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a\": container with ID starting with ebc77cbe103aaf3e38290f7fc55b85471c8c897be742f7cfe2450acd9100a57a not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.095993 4780 scope.go:117] "RemoveContainer" containerID="7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.096576 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea\": container with ID starting with 7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea not found: ID does not exist" containerID="7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.096686 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea"} err="failed to get container status \"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea\": rpc error: code = NotFound desc = could not find container \"7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea\": container with ID starting with 7861af5a650bc82faf757c1890ec4a84a24a3691c6460e571f4352a7d49f58ea not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.096773 4780 scope.go:117] "RemoveContainer" containerID="967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.097480 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5\": container with ID starting with 967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5 not found: ID does not exist" containerID="967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.097576 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5"} err="failed to get container status \"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5\": rpc error: code = NotFound desc = could not find container \"967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5\": container with ID starting with 967c1784472bd2c7c5ffd294f4cf6bbb888986426f4c84cbd577110ca9cbe8b5 not found: ID does not exist" Sep 29 19:05:43 crc 
kubenswrapper[4780]: I0929 19:05:43.097640 4780 scope.go:117] "RemoveContainer" containerID="a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.098569 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2\": container with ID starting with a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2 not found: ID does not exist" containerID="a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.098652 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2"} err="failed to get container status \"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2\": rpc error: code = NotFound desc = could not find container \"a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2\": container with ID starting with a5d02e1679affd507d7330f0728958bb89052009e5c8ce0520191a06c3a607d2 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.098707 4780 scope.go:117] "RemoveContainer" containerID="2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.099251 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b\": container with ID starting with 2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b not found: ID does not exist" containerID="2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.099295 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b"} err="failed to get container status \"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b\": rpc error: code = NotFound desc = could not find container \"2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b\": container with ID starting with 2eeafb02dc091f4f2dc3c5c694d6c7287517acfbf1bf729d6c389b55f4cd560b not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.099323 4780 scope.go:117] "RemoveContainer" containerID="3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.099749 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6\": container with ID starting with 3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6 not found: ID does not exist" containerID="3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.099834 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6"} err="failed to get container status \"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6\": rpc error: code = NotFound desc = could not find container 
\"3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6\": container with ID starting with 3cbe422d6fe9013f115d09f2b0e282fe1dd1dff0efb16e59e9c2955d064f2ba6 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.100179 4780 scope.go:117] "RemoveContainer" containerID="ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.100520 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8\": container with ID starting with ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8 not found: ID does not exist" containerID="ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.100598 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8"} err="failed to get container status \"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8\": rpc error: code = NotFound desc = could not find container \"ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8\": container with ID starting with ee72278837467215852150dc2c03aeda616c338e1c9ee75752ee46e074518aa8 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.100664 4780 scope.go:117] "RemoveContainer" containerID="f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.101139 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435\": container with ID starting with f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435 not found: ID does not exist" containerID="f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.101222 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435"} err="failed to get container status \"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435\": rpc error: code = NotFound desc = could not find container \"f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435\": container with ID starting with f6c595631e99c829067019f99e0e54f5514f7d5735a8e711b1ab085e8be4d435 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.101287 4780 scope.go:117] "RemoveContainer" containerID="ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.101687 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09\": container with ID starting with ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09 not found: ID does not exist" containerID="ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.101726 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09"} 
err="failed to get container status \"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09\": rpc error: code = NotFound desc = could not find container \"ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09\": container with ID starting with ba8432110dceb3ea97a31b5946f8d38f4b00b9a43249f0a2911b5380a99c8b09 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.101751 4780 scope.go:117] "RemoveContainer" containerID="31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.102106 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311\": container with ID starting with 31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311 not found: ID does not exist" containerID="31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.102193 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311"} err="failed to get container status \"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311\": rpc error: code = NotFound desc = could not find container \"31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311\": container with ID starting with 31d14f69ba8efaa6de13896ff1384f8beeb59abb00b82b8c900509159f1ba311 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.102263 4780 scope.go:117] "RemoveContainer" containerID="28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.102633 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf\": container with ID starting with 28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf not found: ID does not exist" containerID="28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.102708 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf"} err="failed to get container status \"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf\": rpc error: code = NotFound desc = could not find container \"28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf\": container with ID starting with 28083de416c69bfd8fd2b033b94c2cab31d43da17f3560191242c07d094088bf not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.102769 4780 scope.go:117] "RemoveContainer" containerID="229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.103170 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61\": container with ID starting with 229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61 not found: ID does not exist" containerID="229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.103242 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61"} err="failed to get container status \"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61\": rpc error: code = NotFound desc = could not find container \"229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61\": container with ID starting with 229190a87f1adeeb66d2323a772fb3438ea9378abd2ddbc67cbd9bf125289d61 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.103304 4780 scope.go:117] "RemoveContainer" containerID="87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.103699 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1\": container with ID starting with 87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1 not found: ID does not exist" containerID="87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.103753 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1"} err="failed to get container status \"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1\": rpc error: code = NotFound desc = could not find container \"87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1\": container with ID starting with 87c79f8b5e95e378d386fee73e771bf1a19f520505cb5afcef4542be5c8457e1 not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.103773 4780 scope.go:117] "RemoveContainer" containerID="c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.104181 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b\": container with ID starting with c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b not found: ID does not exist" containerID="c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.104254 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b"} err="failed to get container status \"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b\": rpc error: code = NotFound desc = could not find container \"c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b\": container with ID starting with c490cd79ccba7f9d033d9093864952d09ea29212e8449020b46c24393a91ec3b not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.104316 4780 scope.go:117] "RemoveContainer" containerID="00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.104902 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f\": container with ID starting with 00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f not found: ID does 
not exist" containerID="00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.104947 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f"} err="failed to get container status \"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f\": rpc error: code = NotFound desc = could not find container \"00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f\": container with ID starting with 00ad9333136ba9e94ef816d11b1fd4f1df25863d0bca0ab419dacc57f50a1f7f not found: ID does not exist" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.640471 4780 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod00f3e1c2-9a7e-42d1-8aa8-396285ea40c8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod00f3e1c2-9a7e-42d1-8aa8-396285ea40c8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod00f3e1c2_9a7e_42d1_8aa8_396285ea40c8.slice" Sep 29 19:05:43 crc kubenswrapper[4780]: E0929 19:05:43.640523 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod00f3e1c2-9a7e-42d1-8aa8-396285ea40c8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod00f3e1c2-9a7e-42d1-8aa8-396285ea40c8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod00f3e1c2_9a7e_42d1_8aa8_396285ea40c8.slice" pod="openstack/ovn-controller-metrics-8vsrs" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.733199 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8vsrs" Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.774020 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:05:43 crc kubenswrapper[4780]: I0929 19:05:43.779452 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-8vsrs"] Sep 29 19:05:44 crc kubenswrapper[4780]: I0929 19:05:44.761085 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f3e1c2-9a7e-42d1-8aa8-396285ea40c8" path="/var/lib/kubelet/pods/00f3e1c2-9a7e-42d1-8aa8-396285ea40c8/volumes" Sep 29 19:05:44 crc kubenswrapper[4780]: I0929 19:05:44.762247 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" path="/var/lib/kubelet/pods/d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1/volumes" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.409265 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.542320 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w78j7\" (UniqueName: \"kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7\") pod \"f271f9ca-bced-4144-b779-06e7422d9a63\" (UID: \"f271f9ca-bced-4144-b779-06e7422d9a63\") " Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.559222 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7" (OuterVolumeSpecName: "kube-api-access-w78j7") pod "f271f9ca-bced-4144-b779-06e7422d9a63" (UID: "f271f9ca-bced-4144-b779-06e7422d9a63"). InnerVolumeSpecName "kube-api-access-w78j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.644036 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w78j7\" (UniqueName: \"kubernetes.io/projected/f271f9ca-bced-4144-b779-06e7422d9a63-kube-api-access-w78j7\") on node \"crc\" DevicePath \"\"" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.785540 4780 generic.go:334] "Generic (PLEG): container finished" podID="f271f9ca-bced-4144-b779-06e7422d9a63" containerID="ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c" exitCode=137 Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.785594 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancefda4-account-delete-4ndzp" event={"ID":"f271f9ca-bced-4144-b779-06e7422d9a63","Type":"ContainerDied","Data":"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c"} Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.785628 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancefda4-account-delete-4ndzp" event={"ID":"f271f9ca-bced-4144-b779-06e7422d9a63","Type":"ContainerDied","Data":"7da71a54065fafc75a333926d18e4c6e000967115b3c89d4cc36c0a56cbc1e01"} Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.785649 4780 scope.go:117] "RemoveContainer" containerID="ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.785797 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancefda4-account-delete-4ndzp" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.809183 4780 scope.go:117] "RemoveContainer" containerID="ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c" Sep 29 19:05:48 crc kubenswrapper[4780]: E0929 19:05:48.809942 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c\": container with ID starting with ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c not found: ID does not exist" containerID="ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.810017 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c"} err="failed to get container status \"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c\": rpc error: code = NotFound desc = could not find container \"ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c\": container with ID starting with ae5a61cb362b01ed7e0442bfac3c2f02b0e005a9eb5da83b8a8ece5e9700762c not found: ID does not exist" Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.811527 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:48 crc kubenswrapper[4780]: I0929 19:05:48.815667 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancefda4-account-delete-4ndzp"] Sep 29 19:05:49 crc kubenswrapper[4780]: I0929 19:05:49.506913 4780 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podb90472c3-a09d-433c-922b-d164a11636e6"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podb90472c3-a09d-433c-922b-d164a11636e6] : Timed out while waiting for systemd to remove kubepods-burstable-podb90472c3_a09d_433c_922b_d164a11636e6.slice" Sep 29 19:05:49 crc kubenswrapper[4780]: E0929 19:05:49.507009 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podb90472c3-a09d-433c-922b-d164a11636e6] : unable to destroy cgroup paths for cgroup [kubepods burstable podb90472c3-a09d-433c-922b-d164a11636e6] : Timed out while waiting for systemd to remove kubepods-burstable-podb90472c3_a09d_433c_922b_d164a11636e6.slice" pod="openstack/rabbitmq-cell1-server-0" podUID="b90472c3-a09d-433c-922b-d164a11636e6" Sep 29 19:05:49 crc kubenswrapper[4780]: I0929 19:05:49.796239 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 19:05:49 crc kubenswrapper[4780]: I0929 19:05:49.865472 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 19:05:49 crc kubenswrapper[4780]: I0929 19:05:49.872819 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 19:05:50 crc kubenswrapper[4780]: I0929 19:05:50.769188 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90472c3-a09d-433c-922b-d164a11636e6" path="/var/lib/kubelet/pods/b90472c3-a09d-433c-922b-d164a11636e6/volumes" Sep 29 19:05:50 crc kubenswrapper[4780]: I0929 19:05:50.770331 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f271f9ca-bced-4144-b779-06e7422d9a63" path="/var/lib/kubelet/pods/f271f9ca-bced-4144-b779-06e7422d9a63/volumes" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.788798 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4fggl"] Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.791905 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.791927 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.791944 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.791950 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.791958 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.791964 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.791975 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.791981 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.791991 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server-init" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.791998 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server-init" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792017 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f271f9ca-bced-4144-b779-06e7422d9a63" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792024 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f271f9ca-bced-4144-b779-06e7422d9a63" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792044 4780 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eed2917c-127a-4dbd-b951-6b141853e47c" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792050 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2917c-127a-4dbd-b951-6b141853e47c" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792070 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792075 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792095 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792101 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792112 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="ovsdbserver-nb" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792118 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="ovsdbserver-nb" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792128 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622d766f-f43c-434c-9353-2315a6c82ae6" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792136 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="622d766f-f43c-434c-9353-2315a6c82ae6" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792146 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerName="nova-cell0-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792152 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerName="nova-cell0-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792164 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792171 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792186 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="swift-recon-cron" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792194 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="swift-recon-cron" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792207 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="mysql-bootstrap" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792213 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="mysql-bootstrap" Sep 29 19:06:08 crc 
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792222 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-expirer"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792231 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-expirer"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792245 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" containerName="kube-state-metrics"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792252 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" containerName="kube-state-metrics"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792263 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="init"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792269 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="init"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792278 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="rabbitmq"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792284 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="rabbitmq"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792296 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="ovsdbserver-sb"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792302 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="ovsdbserver-sb"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792313 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="setup-container"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792318 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="setup-container"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792328 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792335 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792345 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ef0b7e-a06d-49a2-824e-9f088c267a97" containerName="memcached"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792352 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ef0b7e-a06d-49a2-824e-9f088c267a97" containerName="memcached"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792372 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792378 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792388 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-central-agent"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792396 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-central-agent"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792405 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4fe84d-ff10-4ed2-938a-669c30748336" containerName="keystone-api"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792412 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4fe84d-ff10-4ed2-938a-669c30748336" containerName="keystone-api"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792420 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-api"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792427 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-api"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792436 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-notification-agent"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792443 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-notification-agent"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792454 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-auditor"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792461 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-auditor"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792470 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-auditor"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792476 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-auditor"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792486 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f061df-a5ff-4db1-b87f-4106a5e56b55" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792493 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f061df-a5ff-4db1-b87f-4106a5e56b55" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792502 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792509 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792520 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="rabbitmq"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792528 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="rabbitmq"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792534 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792541 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792550 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-httpd"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792556 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-httpd"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792568 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-replicator"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792575 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-replicator"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792585 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503714fd-6dcf-4b1d-8806-dd78a3e85b7f" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792591 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="503714fd-6dcf-4b1d-8806-dd78a3e85b7f" containerName="mariadb-account-delete"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792601 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="setup-container"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792608 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="setup-container"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792614 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792621 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-server"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792628 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-log"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792634 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-log"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792646 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-replicator"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792652 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-replicator"
Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792659 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-replicator"
Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792665 4780 state_mem.go:107] "Deleted CPUSet assignment"
podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-replicator" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792673 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="rsync" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792679 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="rsync" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792689 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792695 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792702 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792708 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792719 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792726 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792737 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792743 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792753 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792759 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-api" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792766 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792772 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792779 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792785 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792793 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="probe" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792799 4780 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="probe" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792808 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerName="nova-scheduler-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792813 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerName="nova-scheduler-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792819 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792825 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792832 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792839 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792848 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792854 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792861 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792868 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-server" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792876 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792883 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792891 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792897 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792903 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc401926-3969-448c-9910-22572fecb168" containerName="nova-cell1-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792909 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc401926-3969-448c-9910-22572fecb168" containerName="nova-cell1-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792920 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: 
I0929 19:06:08.792926 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792934 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="dnsmasq-dns" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792941 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="dnsmasq-dns" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792952 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792969 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792980 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.792989 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.792996 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793003 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793012 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-reaper" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793018 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-reaper" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793026 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793033 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793040 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="mysql-bootstrap" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793049 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="mysql-bootstrap" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793059 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793065 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793076 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: 
I0929 19:06:08.793100 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793110 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-auditor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793118 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-auditor" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793130 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793139 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793151 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793157 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-api" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793168 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="sg-core" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793175 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="sg-core" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793185 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793192 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793203 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793210 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: E0929 19:06:08.793219 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="cinder-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793227 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="cinder-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793388 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793400 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4fe84d-ff10-4ed2-938a-669c30748336" containerName="keystone-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793407 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" 
containerName="cinder-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793413 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="probe" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793423 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793434 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793442 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793451 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793459 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="622d766f-f43c-434c-9353-2315a6c82ae6" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793467 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cf0d8b-ee89-4bed-9b8b-bf902f1f3e0d" containerName="cinder-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793474 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793485 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-replicator" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793495 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-reaper" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793502 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793508 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-auditor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793513 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793523 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="ovn-northd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793532 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc401926-3969-448c-9910-22572fecb168" containerName="nova-cell1-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793541 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793550 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ee2741-9417-4698-b550-7c596d00d271" containerName="rabbitmq" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 
19:06:08.793557 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-auditor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793564 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1d2b75-0893-468d-8365-f08fa8875575" containerName="barbican-keystone-listener" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793572 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f2aaf8-27dc-428c-a387-d63424889230" containerName="nova-cell1-novncproxy-novncproxy" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793580 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5ccc95-6c2c-4f3c-884b-456cf28d6db4" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793586 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ef0b7e-a06d-49a2-824e-9f088c267a97" containerName="memcached" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793592 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-replicator" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793600 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a8fa86-9475-490a-9c9f-09233413eab5" containerName="ovn-controller" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793608 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed88e38f-cb35-4072-8f9f-1c6ab980ec03" containerName="kube-state-metrics" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793619 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8150bb34-1bc0-4c45-92f8-9d8d04f611e3" containerName="barbican-worker-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793631 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed2917c-127a-4dbd-b951-6b141853e47c" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793638 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="628b549e-6d99-43d4-94bb-61b457f4c37b" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793649 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793661 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793669 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793681 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793693 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2903cdd8-3ab5-4c85-892c-2139eb0bde7c" containerName="cinder-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793705 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-expirer" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793711 4780 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b7f300da-65dd-4c6e-ae4a-63b797768651" containerName="glance-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793721 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f061df-a5ff-4db1-b87f-4106a5e56b55" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793731 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-central-agent" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793743 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6422eb63-373a-4b79-88b0-ddd623f7bd79" containerName="proxy-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793755 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793766 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="sg-core" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793774 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="503714fd-6dcf-4b1d-8806-dd78a3e85b7f" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793787 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-auditor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793798 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="container-replicator" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793807 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14f7a20-d45e-4662-b0db-4af394c7daed" containerName="glance-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793819 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="object-updater" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793834 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-httpd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793845 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovs-vswitchd" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793855 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb1a5cd-ae45-4e38-bcf7-40cd5a76b7d8" containerName="neutron-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793862 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-metadata" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793869 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c91af49-2adc-47a1-892c-82da3b338492" containerName="ovsdb-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793878 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="account-server" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793885 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="7373591d-cf39-4674-8b37-449096f6a3b6" containerName="dnsmasq-dns" Sep 29 19:06:08 
crc kubenswrapper[4780]: I0929 19:06:08.793894 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f271f9ca-bced-4144-b779-06e7422d9a63" containerName="mariadb-account-delete" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793902 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8611dff0-9ad1-4bba-b687-958d7e887859" containerName="ovsdbserver-nb" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793910 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="48191511-38e9-46d2-82f8-77453769927c" containerName="galera" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793921 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b9c388-0f74-42fc-bf3d-711322b976d8" containerName="ovsdbserver-sb" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793929 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec846e3f-c11b-4818-a15b-9f855ed48a56" containerName="nova-scheduler-scheduler" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793936 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="swift-recon-cron" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793943 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6b4d2f-2f81-44fd-8c76-2aa6204209c3" containerName="nova-cell0-conductor-conductor" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793952 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e5bce-9395-4758-8121-35408b6df2e2" containerName="ceilometer-notification-agent" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793959 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6105150b-678d-4925-a981-9a0d75377f32" containerName="placement-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793969 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90472c3-a09d-433c-922b-d164a11636e6" containerName="rabbitmq" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793977 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c538b0f-23b3-440d-9775-5f33f7badfd4" containerName="barbican-api" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793985 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="792eb9b5-5b6a-4c61-bc3f-8ab53d64a248" containerName="nova-metadata-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.793992 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3683c554-eec7-4825-8972-0445faf15a23" containerName="openstack-network-exporter" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.794001 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="02521078-2e58-4ce2-bc12-0b6c3b2ed878" containerName="nova-api-log" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.794008 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6e9f4-21b9-4a8f-aa01-3f0924013fe1" containerName="rsync" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.795872 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4fggl"] Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.795935 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.977232 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-utilities\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.977381 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4hg\" (UniqueName: \"kubernetes.io/projected/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-kube-api-access-9p4hg\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:08 crc kubenswrapper[4780]: I0929 19:06:08.977439 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-catalog-content\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.079219 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4hg\" (UniqueName: \"kubernetes.io/projected/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-kube-api-access-9p4hg\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.079311 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-catalog-content\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.079370 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-utilities\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.079967 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-utilities\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.080199 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-catalog-content\") pod \"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.107416 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4hg\" (UniqueName: \"kubernetes.io/projected/6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84-kube-api-access-9p4hg\") pod 
\"certified-operators-4fggl\" (UID: \"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84\") " pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.135387 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.623144 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4fggl"] Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.995595 4780 generic.go:334] "Generic (PLEG): container finished" podID="6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84" containerID="60eef002c474db104058a8526179c26d021ffe134277fffb47f276ca860a52d9" exitCode=0 Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.995713 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fggl" event={"ID":"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84","Type":"ContainerDied","Data":"60eef002c474db104058a8526179c26d021ffe134277fffb47f276ca860a52d9"} Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.995964 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fggl" event={"ID":"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84","Type":"ContainerStarted","Data":"8e65f2b991b31ceeeacf0aa47a1c3c01e37e8f371d30393367eb6f52eb13db2d"} Sep 29 19:06:09 crc kubenswrapper[4780]: I0929 19:06:09.997534 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:06:14 crc kubenswrapper[4780]: I0929 19:06:14.052023 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fggl" event={"ID":"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84","Type":"ContainerStarted","Data":"8a282bad45566b9372df9df027e2a296be177c0ccac59b07771a1e6077a8a915"} Sep 29 19:06:15 crc kubenswrapper[4780]: I0929 19:06:15.083076 4780 generic.go:334] "Generic (PLEG): container finished" podID="6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84" containerID="8a282bad45566b9372df9df027e2a296be177c0ccac59b07771a1e6077a8a915" exitCode=0 Sep 29 19:06:15 crc kubenswrapper[4780]: I0929 19:06:15.083179 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fggl" event={"ID":"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84","Type":"ContainerDied","Data":"8a282bad45566b9372df9df027e2a296be177c0ccac59b07771a1e6077a8a915"} Sep 29 19:06:16 crc kubenswrapper[4780]: I0929 19:06:16.096581 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fggl" event={"ID":"6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84","Type":"ContainerStarted","Data":"a4be70fe5767d5950901b89f92944ea78078cd33e3469c48091b663d765698b0"} Sep 29 19:06:16 crc kubenswrapper[4780]: I0929 19:06:16.125713 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4fggl" podStartSLOduration=2.606755748 podStartE2EDuration="8.125686886s" podCreationTimestamp="2025-09-29 19:06:08 +0000 UTC" firstStartedPulling="2025-09-29 19:06:09.997336062 +0000 UTC m=+1369.945634106" lastFinishedPulling="2025-09-29 19:06:15.5162672 +0000 UTC m=+1375.464565244" observedRunningTime="2025-09-29 19:06:16.119925596 +0000 UTC m=+1376.068223640" watchObservedRunningTime="2025-09-29 19:06:16.125686886 +0000 UTC m=+1376.073984920" Sep 29 19:06:19 crc kubenswrapper[4780]: I0929 19:06:19.136197 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:19 crc kubenswrapper[4780]: I0929 19:06:19.136564 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:19 crc kubenswrapper[4780]: I0929 19:06:19.195903 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.185638 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4fggl" Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.274870 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4fggl"] Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.319528 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.320359 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfstc" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="registry-server" containerID="cri-o://2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833" gracePeriod=2 Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.816707 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.982249 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content\") pod \"bc2a78c6-628f-489f-aa89-435224f9ef3e\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.982385 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities\") pod \"bc2a78c6-628f-489f-aa89-435224f9ef3e\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.982600 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcvrp\" (UniqueName: \"kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp\") pod \"bc2a78c6-628f-489f-aa89-435224f9ef3e\" (UID: \"bc2a78c6-628f-489f-aa89-435224f9ef3e\") " Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.983107 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities" (OuterVolumeSpecName: "utilities") pod "bc2a78c6-628f-489f-aa89-435224f9ef3e" (UID: "bc2a78c6-628f-489f-aa89-435224f9ef3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.983348 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:06:20 crc kubenswrapper[4780]: I0929 19:06:20.990251 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp" (OuterVolumeSpecName: "kube-api-access-qcvrp") pod "bc2a78c6-628f-489f-aa89-435224f9ef3e" (UID: "bc2a78c6-628f-489f-aa89-435224f9ef3e"). InnerVolumeSpecName "kube-api-access-qcvrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.030066 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc2a78c6-628f-489f-aa89-435224f9ef3e" (UID: "bc2a78c6-628f-489f-aa89-435224f9ef3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.084657 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcvrp\" (UniqueName: \"kubernetes.io/projected/bc2a78c6-628f-489f-aa89-435224f9ef3e-kube-api-access-qcvrp\") on node \"crc\" DevicePath \"\"" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.084706 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc2a78c6-628f-489f-aa89-435224f9ef3e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.149393 4780 generic.go:334] "Generic (PLEG): container finished" podID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerID="2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833" exitCode=0 Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.149473 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerDied","Data":"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833"} Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.149533 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfstc" event={"ID":"bc2a78c6-628f-489f-aa89-435224f9ef3e","Type":"ContainerDied","Data":"8a1104bfd0c2d9ab5d87200dfc5152e8eda157699ce5d06c2451915e728f0746"} Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.149560 4780 scope.go:117] "RemoveContainer" containerID="2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.149481 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfstc" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.185784 4780 scope.go:117] "RemoveContainer" containerID="bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.187103 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.193553 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfstc"] Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.212687 4780 scope.go:117] "RemoveContainer" containerID="402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.240363 4780 scope.go:117] "RemoveContainer" containerID="2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833" Sep 29 19:06:21 crc kubenswrapper[4780]: E0929 19:06:21.240859 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833\": container with ID starting with 2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833 not found: ID does not exist" containerID="2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.240892 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833"} err="failed to get container status \"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833\": rpc error: code = NotFound desc = could not find container \"2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833\": container with ID starting with 2240144c3cb8c1c7a958157558c258fff7670cc452796314b628b60c2a176833 not found: ID does not exist" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.240919 4780 scope.go:117] "RemoveContainer" containerID="bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352" Sep 29 19:06:21 crc kubenswrapper[4780]: E0929 19:06:21.241630 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352\": container with ID starting with bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352 not found: ID does not exist" containerID="bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.241692 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352"} err="failed to get container status \"bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352\": rpc error: code = NotFound desc = could not find container \"bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352\": container with ID starting with bd570af7817e047a03c3bb6df9dabd91666bfac75827142fdd61580d320fc352 not found: ID does not exist" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.241733 4780 scope.go:117] "RemoveContainer" containerID="402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e" Sep 29 19:06:21 crc kubenswrapper[4780]: E0929 19:06:21.242193 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e\": container with ID starting with 402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e not found: ID does not exist" containerID="402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e" Sep 29 19:06:21 crc kubenswrapper[4780]: I0929 19:06:21.242265 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e"} err="failed to get container status \"402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e\": rpc error: code = NotFound desc = could not find container \"402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e\": container with ID starting with 402b6a254751ee56a26c88e8766ab5a9f04884b80ed645302d0869530d2e8b7e not found: ID does not exist" Sep 29 19:06:22 crc kubenswrapper[4780]: I0929 19:06:22.764494 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" path="/var/lib/kubelet/pods/bc2a78c6-628f-489f-aa89-435224f9ef3e/volumes" Sep 29 19:06:23 crc kubenswrapper[4780]: I0929 19:06:23.544312 4780 scope.go:117] "RemoveContainer" containerID="9b3b49adc067cecc8ca3e59bbfde4c5f491295762d4a36a1f8d029d61c87f0ca" Sep 29 19:06:23 crc kubenswrapper[4780]: I0929 19:06:23.582214 4780 scope.go:117] "RemoveContainer" containerID="fa44c2b6e56600dfb6c99d6fb0e419237762ff70fabe663a6e3f18eded510c50" Sep 29 19:06:23 crc kubenswrapper[4780]: I0929 19:06:23.607495 4780 scope.go:117] "RemoveContainer" containerID="7699a71ca39a3c8494c9355a46a16bee6be5ac87a42e0ea8defff31cba877178" Sep 29 19:06:23 crc kubenswrapper[4780]: I0929 19:06:23.638426 4780 scope.go:117] "RemoveContainer" containerID="51ea86d45c906caef27896e8dcb0cc239773856c5d0c5bb99056af02148d0f04" Sep 29 19:07:03 crc kubenswrapper[4780]: I0929 19:07:03.223069 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:07:03 crc kubenswrapper[4780]: I0929 19:07:03.223649 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.810164 4780 scope.go:117] "RemoveContainer" containerID="a88ecd9869527db3e1e469d6ec77eea8b98e86fb8ec1d9f3b6c7e933897617ef" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.845823 4780 scope.go:117] "RemoveContainer" containerID="d5e0636251e3ed27e8f1f18c1f0cea8ea6425de2eb38557afac065bde4e57202" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.864440 4780 scope.go:117] "RemoveContainer" containerID="13e8ab18182942ec7c1e17fb68cc0e8c23dbfb081acf5053cae6c70e5e243ff9" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.894511 4780 scope.go:117] "RemoveContainer" containerID="bdc5bfe5a9b796768563b3c0a298999c5127cb243c728936fc569fce10667de8" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.931168 4780 scope.go:117] "RemoveContainer" containerID="5cc733f5ad77b86d578f66e4023e8bf1bfb45a2b08306550d0ed413b92d6dfaf" Sep 
29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.949371 4780 scope.go:117] "RemoveContainer" containerID="d83a512957c0253e871e9228adf6c0ea9b12c0719bac9544b8bf3ce9dc88c419" Sep 29 19:07:23 crc kubenswrapper[4780]: I0929 19:07:23.986154 4780 scope.go:117] "RemoveContainer" containerID="9745fc2edb04b632dafc0c294af08ddefea82649ef893b0c2f3e77cb7ff298cf" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.003062 4780 scope.go:117] "RemoveContainer" containerID="926f9138263106966b8c488fd9a2f55e330d7666c71e67c054a006eecb80d715" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.047229 4780 scope.go:117] "RemoveContainer" containerID="35e5b426f921a88251cd1fd6a84ce25592033487febb99da7b9b4dcfe094b1cb" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.075628 4780 scope.go:117] "RemoveContainer" containerID="e9fbd9141d0e45dd131cdf8af3a6a466f887381720e1ef67a10b7d738ced7a6d" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.094237 4780 scope.go:117] "RemoveContainer" containerID="9fe7a9ed34fb1e01649f4cef920a664af9f8e56918c80aa9fd1b9405170031d8" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.111432 4780 scope.go:117] "RemoveContainer" containerID="dd976228546b3979a349b28fd87ae9da5e203094d0bc0353d3f2409a6ca2e748" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.140420 4780 scope.go:117] "RemoveContainer" containerID="b9bc8661e57232e341261e8e572a959de5b4986fba652b9ef4160b4a3d945a5f" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.161927 4780 scope.go:117] "RemoveContainer" containerID="95bb319bec1c179b889b9f0a8e5359054dc1c3b89c4e646e5d48ce7444e5d055" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.192299 4780 scope.go:117] "RemoveContainer" containerID="c1a07f5d0a702a2af8115f7cb08035b3389648b80d070cec59a16da14f20fe62" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.210915 4780 scope.go:117] "RemoveContainer" containerID="a12d0485b5530686d7a1231632f902193e840b516f0e2c1682e9b5b329c06003" Sep 29 19:07:24 crc kubenswrapper[4780]: I0929 19:07:24.241875 4780 scope.go:117] "RemoveContainer" containerID="4aa1ef08d742f38c18d483e9a34bfc178c7346f4dc1f342627977887f315213f" Sep 29 19:07:33 crc kubenswrapper[4780]: I0929 19:07:33.224365 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:07:33 crc kubenswrapper[4780]: I0929 19:07:33.225508 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:08:03 crc kubenswrapper[4780]: I0929 19:08:03.223520 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:08:03 crc kubenswrapper[4780]: I0929 19:08:03.224356 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:08:03 crc kubenswrapper[4780]: I0929 19:08:03.224423 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:08:03 crc kubenswrapper[4780]: I0929 19:08:03.225230 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:08:03 crc kubenswrapper[4780]: I0929 19:08:03.225305 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" gracePeriod=600 Sep 29 19:08:03 crc kubenswrapper[4780]: E0929 19:08:03.358087 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:08:04 crc kubenswrapper[4780]: I0929 19:08:04.209964 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" exitCode=0 Sep 29 19:08:04 crc kubenswrapper[4780]: I0929 19:08:04.210034 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"} Sep 29 19:08:04 crc kubenswrapper[4780]: I0929 19:08:04.210161 4780 scope.go:117] "RemoveContainer" containerID="158b296bb0b637f86ad18136c175af2360d991a7d6ae9ac64ec4dd848661493a" Sep 29 19:08:04 crc kubenswrapper[4780]: I0929 19:08:04.211993 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:08:04 crc kubenswrapper[4780]: E0929 19:08:04.212490 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:08:17 crc kubenswrapper[4780]: I0929 19:08:17.753232 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:08:17 crc kubenswrapper[4780]: E0929 19:08:17.754160 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.488139 4780 scope.go:117] "RemoveContainer" containerID="2f710c5ce82ca39295b8a385c093185e1904e19720dae5eaeaae61d9187c8809" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.523026 4780 scope.go:117] "RemoveContainer" containerID="b7c50fc2d9534221112ba9758fee8b52356d9efe5f5ed3fb8c0432498719f180" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.547188 4780 scope.go:117] "RemoveContainer" containerID="0df594c76250ec5239387dd545c954b88d180c614903ce4cff7730bbbf798e32" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.571119 4780 scope.go:117] "RemoveContainer" containerID="4b807f34a3c65b6d836e3bd255f8320430de3cf2180ee8e33b572ba6e6717b3b" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.605888 4780 scope.go:117] "RemoveContainer" containerID="ada0d3f9808c1bbda5295b4e75f3aac1b8c137677fb1e9e078bfbd6a6f89a728" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.651144 4780 scope.go:117] "RemoveContainer" containerID="8cdf366e564c41077cac425fb60d05141765f8d392ad2a68245952b02c84e442" Sep 29 19:08:24 crc kubenswrapper[4780]: I0929 19:08:24.675041 4780 scope.go:117] "RemoveContainer" containerID="0affd5ee1b6271df452bc20498cfdf2af34f4084addeda0fabf4b600b2de3adf" Sep 29 19:08:28 crc kubenswrapper[4780]: I0929 19:08:28.754088 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:08:28 crc kubenswrapper[4780]: E0929 19:08:28.755306 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:08:39 crc kubenswrapper[4780]: I0929 19:08:39.752982 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:08:39 crc kubenswrapper[4780]: E0929 19:08:39.754776 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:08:51 crc kubenswrapper[4780]: I0929 19:08:51.752517 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:08:51 crc kubenswrapper[4780]: E0929 19:08:51.753228 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:09:04 crc kubenswrapper[4780]: I0929 19:09:04.752993 
4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:09:04 crc kubenswrapper[4780]: E0929 19:09:04.753914 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:09:16 crc kubenswrapper[4780]: I0929 19:09:16.754153 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:09:16 crc kubenswrapper[4780]: E0929 19:09:16.755404 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.812858 4780 scope.go:117] "RemoveContainer" containerID="739143154f41eccfb13a2b48adb19e687f9f167c8167b59c2ccf652c349ef90e" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.851959 4780 scope.go:117] "RemoveContainer" containerID="3b46613406d996c1017570f6009c858346fa16fd0adca84ce6a74bbfd6caff42" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.888884 4780 scope.go:117] "RemoveContainer" containerID="ce83abddb679064c04606622289ed5fd5649a02935122d4ae1834743018dd57e" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.921013 4780 scope.go:117] "RemoveContainer" containerID="3a9cbda79fe816a7955ce7cbcd0a4685df1c531b028a0f5b21c8ed5f82a66202" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.953885 4780 scope.go:117] "RemoveContainer" containerID="30a39734b4259b0e3f9e44b9d713ecae2d2861e0a690d22cc730cca75a36853d" Sep 29 19:09:24 crc kubenswrapper[4780]: I0929 19:09:24.983921 4780 scope.go:117] "RemoveContainer" containerID="97e685ab1f93a82fa5ae729aab612fc064f95fcdbd21f62041a2a74c7c2ea186" Sep 29 19:09:25 crc kubenswrapper[4780]: I0929 19:09:25.012262 4780 scope.go:117] "RemoveContainer" containerID="2bed75a72ebe1e8129dcb4991c90de2501dc80eaaa5a5d00e170c5bcd8aefd4f" Sep 29 19:09:25 crc kubenswrapper[4780]: I0929 19:09:25.043317 4780 scope.go:117] "RemoveContainer" containerID="032036dc95afbee3502dbfcf272cd78266cf379155bf2e1bb4815adf6cb65e9a" Sep 29 19:09:25 crc kubenswrapper[4780]: I0929 19:09:25.068702 4780 scope.go:117] "RemoveContainer" containerID="fbbd56e2c35a1f0f783bb787a2c0d549b1f65db0f5817736155304a6429851de" Sep 29 19:09:25 crc kubenswrapper[4780]: I0929 19:09:25.106617 4780 scope.go:117] "RemoveContainer" containerID="dbcf9928788092ab977ae712ae4612aa67a6f7d49fb7301d908346b1aca4b563" Sep 29 19:09:30 crc kubenswrapper[4780]: I0929 19:09:30.756662 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:09:30 crc kubenswrapper[4780]: E0929 19:09:30.757203 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:09:43 crc kubenswrapper[4780]: I0929 19:09:43.753766 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:09:43 crc kubenswrapper[4780]: E0929 19:09:43.754434 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:09:58 crc kubenswrapper[4780]: I0929 19:09:58.753112 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:09:58 crc kubenswrapper[4780]: E0929 19:09:58.753846 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:10:11 crc kubenswrapper[4780]: I0929 19:10:11.753326 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:10:11 crc kubenswrapper[4780]: E0929 19:10:11.754496 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:10:23 crc kubenswrapper[4780]: I0929 19:10:23.753395 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:10:23 crc kubenswrapper[4780]: E0929 19:10:23.754094 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.265401 4780 scope.go:117] "RemoveContainer" containerID="2206fdfda1b3679c9eaab7892ccf4c32611624a3996175a4dd0502159b261a25" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.286964 4780 scope.go:117] "RemoveContainer" containerID="f88db872bb531d67943f47affb487b5a77c5ff64bdf19d2564052e453ae34187" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.304880 4780 scope.go:117] "RemoveContainer" containerID="1bf3800786032f687dfb373cbc1d24ace1919441397847f347217bf7a840db61" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.321837 4780 scope.go:117] "RemoveContainer" 
containerID="bedf0c2e64d32086726c83fc23935b9eb7e3b0cccc9c1ff45f3505778e088224" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.370611 4780 scope.go:117] "RemoveContainer" containerID="f64e558c1911bea7506b2cdd5c000f9c4c3d8816f4e4b6adc9002538b83090a4" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.394861 4780 scope.go:117] "RemoveContainer" containerID="667ea135113b961c6b2a36ada8f212c39fb66bc12bbf320d1a2bbbed6a920a4c" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.435334 4780 scope.go:117] "RemoveContainer" containerID="60308fe91edf8a4076c678529c97807d46bf256eda44506a25d107905d15a376" Sep 29 19:10:25 crc kubenswrapper[4780]: I0929 19:10:25.471574 4780 scope.go:117] "RemoveContainer" containerID="a13ab8e97bfc1c433e41ba1fdbdc614073a33b3747ee1e7b9e9cd3cb214ce595" Sep 29 19:10:36 crc kubenswrapper[4780]: I0929 19:10:36.753790 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:10:36 crc kubenswrapper[4780]: E0929 19:10:36.756196 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:10:47 crc kubenswrapper[4780]: I0929 19:10:47.753261 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:10:47 crc kubenswrapper[4780]: E0929 19:10:47.753836 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:11:00 crc kubenswrapper[4780]: I0929 19:11:00.756500 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:11:00 crc kubenswrapper[4780]: E0929 19:11:00.757465 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:11:15 crc kubenswrapper[4780]: I0929 19:11:15.753912 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:11:15 crc kubenswrapper[4780]: E0929 19:11:15.755629 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:11:25 crc kubenswrapper[4780]: I0929 19:11:25.553176 4780 
scope.go:117] "RemoveContainer" containerID="06b644ef5b1ab2aed1b81290fa9144d38c32c66e7d427c70b6dfb41dd252e0ac" Sep 29 19:11:25 crc kubenswrapper[4780]: I0929 19:11:25.588395 4780 scope.go:117] "RemoveContainer" containerID="243f2148e91378364f15ab121c1c59d743e4fb88de99cd62a3d13f1cfef6c462" Sep 29 19:11:25 crc kubenswrapper[4780]: I0929 19:11:25.615359 4780 scope.go:117] "RemoveContainer" containerID="5819364439d5cb95e55e3ab9534211d4daffea11f49034a2d033c40c6ec821a9" Sep 29 19:11:27 crc kubenswrapper[4780]: I0929 19:11:27.753371 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:11:27 crc kubenswrapper[4780]: E0929 19:11:27.754788 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:11:39 crc kubenswrapper[4780]: I0929 19:11:39.753588 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:11:39 crc kubenswrapper[4780]: E0929 19:11:39.754339 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:11:52 crc kubenswrapper[4780]: I0929 19:11:52.753707 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:11:52 crc kubenswrapper[4780]: E0929 19:11:52.754696 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:12:05 crc kubenswrapper[4780]: I0929 19:12:05.752960 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:12:05 crc kubenswrapper[4780]: E0929 19:12:05.753806 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:12:17 crc kubenswrapper[4780]: I0929 19:12:17.752679 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:12:17 crc kubenswrapper[4780]: E0929 19:12:17.753408 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
Sep 29 19:12:28 crc kubenswrapper[4780]: I0929 19:12:28.752884 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"
Sep 29 19:12:28 crc kubenswrapper[4780]: E0929 19:12:28.753750 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:12:40 crc kubenswrapper[4780]: I0929 19:12:40.758399 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"
Sep 29 19:12:40 crc kubenswrapper[4780]: E0929 19:12:40.759419 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:12:54 crc kubenswrapper[4780]: I0929 19:12:54.752839 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"
Sep 29 19:12:54 crc kubenswrapper[4780]: E0929 19:12:54.754302 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:13:05 crc kubenswrapper[4780]: I0929 19:13:05.753416 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d"
Sep 29 19:13:06 crc kubenswrapper[4780]: I0929 19:13:06.869114 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493"}
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.148075 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"]
Sep 29 19:15:00 crc kubenswrapper[4780]: E0929 19:15:00.149009 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="registry-server"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.149026 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="registry-server"
Sep 29 19:15:00 crc kubenswrapper[4780]: E0929 19:15:00.149061 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="extract-utilities"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.149081 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="extract-utilities"
Sep 29 19:15:00 crc kubenswrapper[4780]: E0929 19:15:00.149095 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="extract-content"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.149104 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="extract-content"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.149377 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2a78c6-628f-489f-aa89-435224f9ef3e" containerName="registry-server"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.150522 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.153563 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.154016 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.165512 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"]
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.190169 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.190286 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.190347 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.291634 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.291743 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"
"operationExecutor.MountVolume started for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.291799 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.292678 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.299040 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.310672 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") pod \"collect-profiles-29319555-9tlfq\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.479913 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.736180 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"] Sep 29 19:15:00 crc kubenswrapper[4780]: I0929 19:15:00.781921 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" event={"ID":"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0","Type":"ContainerStarted","Data":"b296bf635d102ab8e10fa4726891fe485471e489b0d345c7dbccca3728fd7983"} Sep 29 19:15:01 crc kubenswrapper[4780]: I0929 19:15:01.790527 4780 generic.go:334] "Generic (PLEG): container finished" podID="4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" containerID="0f2249c0677d88700c503b59bc6a9fddd94a5e159fe8eb6138551bbc907b4659" exitCode=0 Sep 29 19:15:01 crc kubenswrapper[4780]: I0929 19:15:01.790675 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" event={"ID":"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0","Type":"ContainerDied","Data":"0f2249c0677d88700c503b59bc6a9fddd94a5e159fe8eb6138551bbc907b4659"} Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.087166 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.244292 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume\") pod \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.244395 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") pod \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.244438 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume\") pod \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\" (UID: \"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0\") " Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.245826 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" (UID: "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.250390 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s" (OuterVolumeSpecName: "kube-api-access-xjb5s") pod "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" (UID: "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0"). InnerVolumeSpecName "kube-api-access-xjb5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.254511 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" (UID: "4eac5c32-b786-4c2a-a97a-ba4400d3a3b0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.346439 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.346486 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjb5s\" (UniqueName: \"kubernetes.io/projected/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-kube-api-access-xjb5s\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.346503 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.507027 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:03 crc kubenswrapper[4780]: E0929 19:15:03.507323 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" containerName="collect-profiles" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.507342 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" containerName="collect-profiles" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.507529 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" containerName="collect-profiles" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.508523 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.536683 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.651243 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9wd\" (UniqueName: \"kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.651305 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.651336 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.752432 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9wd\" (UniqueName: \"kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd\") pod \"redhat-marketplace-45t94\" (UID: 
\"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.752519 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.752562 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.753578 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.753653 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.769509 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9wd\" (UniqueName: \"kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd\") pod \"redhat-marketplace-45t94\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.810792 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" event={"ID":"4eac5c32-b786-4c2a-a97a-ba4400d3a3b0","Type":"ContainerDied","Data":"b296bf635d102ab8e10fa4726891fe485471e489b0d345c7dbccca3728fd7983"} Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.810835 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b296bf635d102ab8e10fa4726891fe485471e489b0d345c7dbccca3728fd7983" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.810842 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq" Sep 29 19:15:03 crc kubenswrapper[4780]: I0929 19:15:03.826583 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:04 crc kubenswrapper[4780]: I0929 19:15:04.317573 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:04 crc kubenswrapper[4780]: I0929 19:15:04.821504 4780 generic.go:334] "Generic (PLEG): container finished" podID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerID="74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503" exitCode=0 Sep 29 19:15:04 crc kubenswrapper[4780]: I0929 19:15:04.821663 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerDied","Data":"74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503"} Sep 29 19:15:04 crc kubenswrapper[4780]: I0929 19:15:04.821968 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerStarted","Data":"2367812f9745c3c687b1cd23dee0ce36cb5f36500c3275e3729e2ffcf84d7f0c"} Sep 29 19:15:04 crc kubenswrapper[4780]: I0929 19:15:04.823784 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:15:05 crc kubenswrapper[4780]: I0929 19:15:05.835095 4780 generic.go:334] "Generic (PLEG): container finished" podID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerID="1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339" exitCode=0 Sep 29 19:15:05 crc kubenswrapper[4780]: I0929 19:15:05.835206 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerDied","Data":"1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339"} Sep 29 19:15:06 crc kubenswrapper[4780]: I0929 19:15:06.845979 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerStarted","Data":"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4"} Sep 29 19:15:13 crc kubenswrapper[4780]: I0929 19:15:13.827191 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:13 crc kubenswrapper[4780]: I0929 19:15:13.827747 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:13 crc kubenswrapper[4780]: I0929 19:15:13.885709 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:13 crc kubenswrapper[4780]: I0929 19:15:13.916148 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-45t94" podStartSLOduration=9.473054489 podStartE2EDuration="10.91611445s" podCreationTimestamp="2025-09-29 19:15:03 +0000 UTC" firstStartedPulling="2025-09-29 19:15:04.823548106 +0000 UTC m=+1904.771846150" lastFinishedPulling="2025-09-29 19:15:06.266608057 +0000 UTC m=+1906.214906111" observedRunningTime="2025-09-29 19:15:06.880877938 +0000 UTC m=+1906.829176022" watchObservedRunningTime="2025-09-29 19:15:13.91611445 +0000 UTC m=+1913.864412494" Sep 29 19:15:13 crc kubenswrapper[4780]: I0929 19:15:13.966349 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:14 crc kubenswrapper[4780]: I0929 19:15:14.129289 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:15 crc kubenswrapper[4780]: I0929 19:15:15.944877 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-45t94" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="registry-server" containerID="cri-o://037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4" gracePeriod=2 Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.439226 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.617989 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content\") pod \"a5564744-c70a-4f51-a18f-86c77f435d6b\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.618626 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9wd\" (UniqueName: \"kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd\") pod \"a5564744-c70a-4f51-a18f-86c77f435d6b\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.618703 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities\") pod \"a5564744-c70a-4f51-a18f-86c77f435d6b\" (UID: \"a5564744-c70a-4f51-a18f-86c77f435d6b\") " Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.620502 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities" (OuterVolumeSpecName: "utilities") pod "a5564744-c70a-4f51-a18f-86c77f435d6b" (UID: "a5564744-c70a-4f51-a18f-86c77f435d6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.625508 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd" (OuterVolumeSpecName: "kube-api-access-ml9wd") pod "a5564744-c70a-4f51-a18f-86c77f435d6b" (UID: "a5564744-c70a-4f51-a18f-86c77f435d6b"). InnerVolumeSpecName "kube-api-access-ml9wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.632574 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5564744-c70a-4f51-a18f-86c77f435d6b" (UID: "a5564744-c70a-4f51-a18f-86c77f435d6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.720488 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.720535 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9wd\" (UniqueName: \"kubernetes.io/projected/a5564744-c70a-4f51-a18f-86c77f435d6b-kube-api-access-ml9wd\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.720575 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5564744-c70a-4f51-a18f-86c77f435d6b-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.959884 4780 generic.go:334] "Generic (PLEG): container finished" podID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerID="037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4" exitCode=0 Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.959936 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerDied","Data":"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4"} Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.959967 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-45t94" event={"ID":"a5564744-c70a-4f51-a18f-86c77f435d6b","Type":"ContainerDied","Data":"2367812f9745c3c687b1cd23dee0ce36cb5f36500c3275e3729e2ffcf84d7f0c"} Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.959986 4780 scope.go:117] "RemoveContainer" containerID="037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4" Sep 29 19:15:16 crc kubenswrapper[4780]: I0929 19:15:16.960194 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-45t94" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.000041 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.004120 4780 scope.go:117] "RemoveContainer" containerID="1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.007781 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-45t94"] Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.036163 4780 scope.go:117] "RemoveContainer" containerID="74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.076746 4780 scope.go:117] "RemoveContainer" containerID="037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4" Sep 29 19:15:17 crc kubenswrapper[4780]: E0929 19:15:17.077372 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4\": container with ID starting with 037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4 not found: ID does not exist" containerID="037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.077425 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4"} err="failed to get container status \"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4\": rpc error: code = NotFound desc = could not find container \"037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4\": container with ID starting with 037ef3bcc2c418a63729aca2f112e761bbd9fbd852665f822b264c3125fe91b4 not found: ID does not exist" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.077461 4780 scope.go:117] "RemoveContainer" containerID="1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339" Sep 29 19:15:17 crc kubenswrapper[4780]: E0929 19:15:17.077946 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339\": container with ID starting with 1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339 not found: ID does not exist" containerID="1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.078002 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339"} err="failed to get container status \"1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339\": rpc error: code = NotFound desc = could not find container \"1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339\": container with ID starting with 1b2603ed3c960e58bef631a9bdfbc3a88445ec51052a4ea87f69f53927db0339 not found: ID does not exist" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.078065 4780 scope.go:117] "RemoveContainer" containerID="74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503" Sep 29 19:15:17 crc kubenswrapper[4780]: E0929 19:15:17.078438 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503\": container with ID starting with 74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503 not found: ID does not exist" containerID="74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503" Sep 29 19:15:17 crc kubenswrapper[4780]: I0929 19:15:17.078489 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503"} err="failed to get container status \"74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503\": rpc error: code = NotFound desc = could not find container \"74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503\": container with ID starting with 74f4089acd3a1499a1e8900d2de955c17fd20fb1d061698ed525a6f35abca503 not found: ID does not exist" Sep 29 19:15:18 crc kubenswrapper[4780]: I0929 19:15:18.767669 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" path="/var/lib/kubelet/pods/a5564744-c70a-4f51-a18f-86c77f435d6b/volumes" Sep 29 19:15:33 crc kubenswrapper[4780]: I0929 19:15:33.223980 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:15:33 crc kubenswrapper[4780]: I0929 19:15:33.224568 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.223027 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:15:59 crc kubenswrapper[4780]: E0929 19:15:59.224769 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="extract-utilities" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.224794 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="extract-utilities" Sep 29 19:15:59 crc kubenswrapper[4780]: E0929 19:15:59.224833 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="registry-server" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.224843 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="registry-server" Sep 29 19:15:59 crc kubenswrapper[4780]: E0929 19:15:59.224878 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="extract-content" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.224888 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="extract-content" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.239726 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5564744-c70a-4f51-a18f-86c77f435d6b" containerName="registry-server" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 
19:15:59.243849 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.248305 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.346243 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.346804 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.347034 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbbfn\" (UniqueName: \"kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.449533 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.449645 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbbfn\" (UniqueName: \"kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.449706 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.450476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.450679 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc 
kubenswrapper[4780]: I0929 19:15:59.473928 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbbfn\" (UniqueName: \"kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn\") pod \"community-operators-nnppg\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:15:59 crc kubenswrapper[4780]: I0929 19:15:59.577540 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:00 crc kubenswrapper[4780]: I0929 19:16:00.144023 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:16:00 crc kubenswrapper[4780]: I0929 19:16:00.412918 4780 generic.go:334] "Generic (PLEG): container finished" podID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerID="c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be" exitCode=0 Sep 29 19:16:00 crc kubenswrapper[4780]: I0929 19:16:00.413012 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerDied","Data":"c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be"} Sep 29 19:16:00 crc kubenswrapper[4780]: I0929 19:16:00.413367 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerStarted","Data":"95a85aafafba5fafe992e39219f88e64030d16933e8c3a0f92bde5e8e9332b53"} Sep 29 19:16:02 crc kubenswrapper[4780]: I0929 19:16:02.435266 4780 generic.go:334] "Generic (PLEG): container finished" podID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerID="6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f" exitCode=0 Sep 29 19:16:02 crc kubenswrapper[4780]: I0929 19:16:02.435370 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerDied","Data":"6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f"} Sep 29 19:16:03 crc kubenswrapper[4780]: I0929 19:16:03.223447 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:16:03 crc kubenswrapper[4780]: I0929 19:16:03.224250 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:16:03 crc kubenswrapper[4780]: I0929 19:16:03.451820 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerStarted","Data":"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a"} Sep 29 19:16:03 crc kubenswrapper[4780]: I0929 19:16:03.471858 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnppg" podStartSLOduration=2.067374964 podStartE2EDuration="4.471831559s" 
podCreationTimestamp="2025-09-29 19:15:59 +0000 UTC" firstStartedPulling="2025-09-29 19:16:00.416685656 +0000 UTC m=+1960.364983720" lastFinishedPulling="2025-09-29 19:16:02.821142261 +0000 UTC m=+1962.769440315" observedRunningTime="2025-09-29 19:16:03.471134458 +0000 UTC m=+1963.419432512" watchObservedRunningTime="2025-09-29 19:16:03.471831559 +0000 UTC m=+1963.420129623" Sep 29 19:16:09 crc kubenswrapper[4780]: I0929 19:16:09.577797 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:09 crc kubenswrapper[4780]: I0929 19:16:09.578421 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:09 crc kubenswrapper[4780]: I0929 19:16:09.640849 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:10 crc kubenswrapper[4780]: I0929 19:16:10.578214 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:10 crc kubenswrapper[4780]: I0929 19:16:10.637586 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:16:12 crc kubenswrapper[4780]: I0929 19:16:12.556895 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnppg" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="registry-server" containerID="cri-o://d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a" gracePeriod=2 Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.031871 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.108144 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content\") pod \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.108416 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbbfn\" (UniqueName: \"kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn\") pod \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.108476 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities\") pod \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\" (UID: \"12c715d4-49ef-4f20-88cb-36d7e8edfb4a\") " Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.109735 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities" (OuterVolumeSpecName: "utilities") pod "12c715d4-49ef-4f20-88cb-36d7e8edfb4a" (UID: "12c715d4-49ef-4f20-88cb-36d7e8edfb4a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.115890 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn" (OuterVolumeSpecName: "kube-api-access-rbbfn") pod "12c715d4-49ef-4f20-88cb-36d7e8edfb4a" (UID: "12c715d4-49ef-4f20-88cb-36d7e8edfb4a"). InnerVolumeSpecName "kube-api-access-rbbfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.211094 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbbfn\" (UniqueName: \"kubernetes.io/projected/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-kube-api-access-rbbfn\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.211146 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.284933 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12c715d4-49ef-4f20-88cb-36d7e8edfb4a" (UID: "12c715d4-49ef-4f20-88cb-36d7e8edfb4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.313725 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12c715d4-49ef-4f20-88cb-36d7e8edfb4a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.569959 4780 generic.go:334] "Generic (PLEG): container finished" podID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerID="d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a" exitCode=0 Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.570136 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerDied","Data":"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a"} Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.570246 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnppg" event={"ID":"12c715d4-49ef-4f20-88cb-36d7e8edfb4a","Type":"ContainerDied","Data":"95a85aafafba5fafe992e39219f88e64030d16933e8c3a0f92bde5e8e9332b53"} Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.570280 4780 scope.go:117] "RemoveContainer" containerID="d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.570305 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnppg" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.597869 4780 scope.go:117] "RemoveContainer" containerID="6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.632565 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.639993 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnppg"] Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.645266 4780 scope.go:117] "RemoveContainer" containerID="c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.674428 4780 scope.go:117] "RemoveContainer" containerID="d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a" Sep 29 19:16:13 crc kubenswrapper[4780]: E0929 19:16:13.677619 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a\": container with ID starting with d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a not found: ID does not exist" containerID="d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.677677 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a"} err="failed to get container status \"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a\": rpc error: code = NotFound desc = could not find container \"d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a\": container with ID starting with d808f22ebc57769865851148721357d7ef8971967fa9ae288aae30c10b18046a not found: ID does not exist" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.677717 4780 scope.go:117] "RemoveContainer" containerID="6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f" Sep 29 19:16:13 crc kubenswrapper[4780]: E0929 19:16:13.678694 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f\": container with ID starting with 6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f not found: ID does not exist" containerID="6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.678754 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f"} err="failed to get container status \"6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f\": rpc error: code = NotFound desc = could not find container \"6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f\": container with ID starting with 6d6252ea6f743d1ea40a469fca2fc36134a7338c9ef7f929285067273566a82f not found: ID does not exist" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.678791 4780 scope.go:117] "RemoveContainer" containerID="c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be" Sep 29 19:16:13 crc kubenswrapper[4780]: E0929 19:16:13.679281 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be\": container with ID starting with c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be not found: ID does not exist" containerID="c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be" Sep 29 19:16:13 crc kubenswrapper[4780]: I0929 19:16:13.679314 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be"} err="failed to get container status \"c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be\": rpc error: code = NotFound desc = could not find container \"c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be\": container with ID starting with c63aaa9730dab6ca0865e5402f00a81ae854092433adf7c104adf6fbfb8267be not found: ID does not exist" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.764806 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" path="/var/lib/kubelet/pods/12c715d4-49ef-4f20-88cb-36d7e8edfb4a/volumes" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.920273 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:14 crc kubenswrapper[4780]: E0929 19:16:14.921226 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="registry-server" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.921249 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="registry-server" Sep 29 19:16:14 crc kubenswrapper[4780]: E0929 19:16:14.921259 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="extract-content" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.921267 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="extract-content" Sep 29 19:16:14 crc kubenswrapper[4780]: E0929 19:16:14.921302 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="extract-utilities" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.921310 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="extract-utilities" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.921508 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c715d4-49ef-4f20-88cb-36d7e8edfb4a" containerName="registry-server" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.922736 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:14 crc kubenswrapper[4780]: I0929 19:16:14.941027 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.040496 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.040625 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.040691 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrn7\" (UniqueName: \"kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.142151 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrn7\" (UniqueName: \"kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.142245 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.142306 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.142965 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.143083 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.168791 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hwrn7\" (UniqueName: \"kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7\") pod \"redhat-operators-g5jqq\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.245787 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:15 crc kubenswrapper[4780]: I0929 19:16:15.733483 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:16 crc kubenswrapper[4780]: I0929 19:16:16.604009 4780 generic.go:334] "Generic (PLEG): container finished" podID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerID="cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece" exitCode=0 Sep 29 19:16:16 crc kubenswrapper[4780]: I0929 19:16:16.604172 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerDied","Data":"cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece"} Sep 29 19:16:16 crc kubenswrapper[4780]: I0929 19:16:16.604259 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerStarted","Data":"f83929d68d4c4fe7675a9e921c86788e8fd5822c94ff775c4d69ca1f13e17278"} Sep 29 19:16:18 crc kubenswrapper[4780]: I0929 19:16:18.627251 4780 generic.go:334] "Generic (PLEG): container finished" podID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerID="c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6" exitCode=0 Sep 29 19:16:18 crc kubenswrapper[4780]: I0929 19:16:18.627367 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerDied","Data":"c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6"} Sep 29 19:16:19 crc kubenswrapper[4780]: I0929 19:16:19.643513 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerStarted","Data":"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576"} Sep 29 19:16:19 crc kubenswrapper[4780]: I0929 19:16:19.690302 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5jqq" podStartSLOduration=3.16558396 podStartE2EDuration="5.690262876s" podCreationTimestamp="2025-09-29 19:16:14 +0000 UTC" firstStartedPulling="2025-09-29 19:16:16.608124079 +0000 UTC m=+1976.556422113" lastFinishedPulling="2025-09-29 19:16:19.132802985 +0000 UTC m=+1979.081101029" observedRunningTime="2025-09-29 19:16:19.67974273 +0000 UTC m=+1979.628040834" watchObservedRunningTime="2025-09-29 19:16:19.690262876 +0000 UTC m=+1979.638560960" Sep 29 19:16:25 crc kubenswrapper[4780]: I0929 19:16:25.246395 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:25 crc kubenswrapper[4780]: I0929 19:16:25.246897 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:25 crc kubenswrapper[4780]: I0929 19:16:25.322000 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:25 crc kubenswrapper[4780]: I0929 19:16:25.759607 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:25 crc kubenswrapper[4780]: I0929 19:16:25.824724 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:27 crc kubenswrapper[4780]: I0929 19:16:27.728498 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g5jqq" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="registry-server" containerID="cri-o://754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576" gracePeriod=2 Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.212646 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.297983 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwrn7\" (UniqueName: \"kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7\") pod \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.298376 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities\") pod \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.298490 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content\") pod \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\" (UID: \"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e\") " Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.299408 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities" (OuterVolumeSpecName: "utilities") pod "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" (UID: "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.300202 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.309465 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7" (OuterVolumeSpecName: "kube-api-access-hwrn7") pod "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" (UID: "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e"). InnerVolumeSpecName "kube-api-access-hwrn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.401633 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwrn7\" (UniqueName: \"kubernetes.io/projected/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-kube-api-access-hwrn7\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.425163 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" (UID: "b970a8d0-3f4b-424e-b3fc-b132cbe4c19e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.504145 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.748192 4780 generic.go:334] "Generic (PLEG): container finished" podID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerID="754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576" exitCode=0 Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.748245 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerDied","Data":"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576"} Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.748291 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5jqq" event={"ID":"b970a8d0-3f4b-424e-b3fc-b132cbe4c19e","Type":"ContainerDied","Data":"f83929d68d4c4fe7675a9e921c86788e8fd5822c94ff775c4d69ca1f13e17278"} Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.748313 4780 scope.go:117] "RemoveContainer" containerID="754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.748897 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5jqq" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.785299 4780 scope.go:117] "RemoveContainer" containerID="c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.802120 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.810773 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g5jqq"] Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.826536 4780 scope.go:117] "RemoveContainer" containerID="cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.851344 4780 scope.go:117] "RemoveContainer" containerID="754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576" Sep 29 19:16:28 crc kubenswrapper[4780]: E0929 19:16:28.854277 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576\": container with ID starting with 754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576 not found: ID does not exist" containerID="754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.854464 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576"} err="failed to get container status \"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576\": rpc error: code = NotFound desc = could not find container \"754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576\": container with ID starting with 754f031f6475a371888a941a3060db26215e209c9acd651a6ae573061ec3f576 not found: ID does not exist" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.854617 4780 scope.go:117] "RemoveContainer" containerID="c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6" Sep 29 19:16:28 crc kubenswrapper[4780]: E0929 19:16:28.855317 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6\": container with ID starting with c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6 not found: ID does not exist" containerID="c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.855470 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6"} err="failed to get container status \"c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6\": rpc error: code = NotFound desc = could not find container \"c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6\": container with ID starting with c88a5659c114cb3fb655d040f84bbe31c64eed57ceaab0fa7291b686bc3ab8f6 not found: ID does not exist" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.855578 4780 scope.go:117] "RemoveContainer" containerID="cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece" Sep 29 19:16:28 crc kubenswrapper[4780]: E0929 19:16:28.856309 4780 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece\": container with ID starting with cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece not found: ID does not exist" containerID="cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece" Sep 29 19:16:28 crc kubenswrapper[4780]: I0929 19:16:28.856418 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece"} err="failed to get container status \"cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece\": rpc error: code = NotFound desc = could not find container \"cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece\": container with ID starting with cc9847ce3ce00961a21c001c4a918670a8fbb1a3bee32a74e4ab14997dde8ece not found: ID does not exist" Sep 29 19:16:30 crc kubenswrapper[4780]: I0929 19:16:30.765171 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" path="/var/lib/kubelet/pods/b970a8d0-3f4b-424e-b3fc-b132cbe4c19e/volumes" Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.223549 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.223655 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.223751 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.224509 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.224574 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493" gracePeriod=600 Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.811037 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493" exitCode=0 Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.811087 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493"} 
Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.811621 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c"} Sep 29 19:16:33 crc kubenswrapper[4780]: I0929 19:16:33.811678 4780 scope.go:117] "RemoveContainer" containerID="82a553bf3ca84b4ad35c7bedb78bcf30c1683c7f9cc2db02ae5d7e5cb3b0bf2d" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.675854 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:22 crc kubenswrapper[4780]: E0929 19:17:22.677694 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="extract-content" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.677725 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="extract-content" Sep 29 19:17:22 crc kubenswrapper[4780]: E0929 19:17:22.677777 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="extract-utilities" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.677791 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="extract-utilities" Sep 29 19:17:22 crc kubenswrapper[4780]: E0929 19:17:22.677821 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="registry-server" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.677835 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="registry-server" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.678221 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b970a8d0-3f4b-424e-b3fc-b132cbe4c19e" containerName="registry-server" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.682965 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.691944 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.863590 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.863681 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.863923 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5srn\" (UniqueName: \"kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.965955 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.966216 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.966389 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5srn\" (UniqueName: \"kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.967190 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:22 crc kubenswrapper[4780]: I0929 19:17:22.967342 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:23 crc kubenswrapper[4780]: I0929 19:17:23.007725 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v5srn\" (UniqueName: \"kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn\") pod \"certified-operators-9c55x\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:23 crc kubenswrapper[4780]: I0929 19:17:23.022008 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:23 crc kubenswrapper[4780]: I0929 19:17:23.529399 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:24 crc kubenswrapper[4780]: I0929 19:17:24.341318 4780 generic.go:334] "Generic (PLEG): container finished" podID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerID="24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1" exitCode=0 Sep 29 19:17:24 crc kubenswrapper[4780]: I0929 19:17:24.341398 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerDied","Data":"24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1"} Sep 29 19:17:24 crc kubenswrapper[4780]: I0929 19:17:24.341469 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerStarted","Data":"bd4fee9095adb7087992a62e1c03799c18bedfc8d4801a8dcaf321953ffc41da"} Sep 29 19:17:26 crc kubenswrapper[4780]: I0929 19:17:26.364578 4780 generic.go:334] "Generic (PLEG): container finished" podID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerID="6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453" exitCode=0 Sep 29 19:17:26 crc kubenswrapper[4780]: I0929 19:17:26.364801 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerDied","Data":"6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453"} Sep 29 19:17:27 crc kubenswrapper[4780]: I0929 19:17:27.379040 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerStarted","Data":"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e"} Sep 29 19:17:27 crc kubenswrapper[4780]: I0929 19:17:27.402153 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9c55x" podStartSLOduration=2.939630836 podStartE2EDuration="5.402132611s" podCreationTimestamp="2025-09-29 19:17:22 +0000 UTC" firstStartedPulling="2025-09-29 19:17:24.344567549 +0000 UTC m=+2044.292865643" lastFinishedPulling="2025-09-29 19:17:26.807069374 +0000 UTC m=+2046.755367418" observedRunningTime="2025-09-29 19:17:27.397958703 +0000 UTC m=+2047.346256787" watchObservedRunningTime="2025-09-29 19:17:27.402132611 +0000 UTC m=+2047.350430655" Sep 29 19:17:33 crc kubenswrapper[4780]: I0929 19:17:33.023283 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:33 crc kubenswrapper[4780]: I0929 19:17:33.023961 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:33 crc kubenswrapper[4780]: I0929 19:17:33.084248 4780 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:33 crc kubenswrapper[4780]: I0929 19:17:33.520023 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:34 crc kubenswrapper[4780]: I0929 19:17:34.255298 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.458277 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9c55x" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="registry-server" containerID="cri-o://9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e" gracePeriod=2 Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.910709 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.989605 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content\") pod \"270281e8-f136-4d50-aff8-beb6e1c67ae2\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.989688 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities\") pod \"270281e8-f136-4d50-aff8-beb6e1c67ae2\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.989884 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5srn\" (UniqueName: \"kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn\") pod \"270281e8-f136-4d50-aff8-beb6e1c67ae2\" (UID: \"270281e8-f136-4d50-aff8-beb6e1c67ae2\") " Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.990966 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities" (OuterVolumeSpecName: "utilities") pod "270281e8-f136-4d50-aff8-beb6e1c67ae2" (UID: "270281e8-f136-4d50-aff8-beb6e1c67ae2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.991378 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:17:35 crc kubenswrapper[4780]: I0929 19:17:35.995897 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn" (OuterVolumeSpecName: "kube-api-access-v5srn") pod "270281e8-f136-4d50-aff8-beb6e1c67ae2" (UID: "270281e8-f136-4d50-aff8-beb6e1c67ae2"). InnerVolumeSpecName "kube-api-access-v5srn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.092972 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5srn\" (UniqueName: \"kubernetes.io/projected/270281e8-f136-4d50-aff8-beb6e1c67ae2-kube-api-access-v5srn\") on node \"crc\" DevicePath \"\"" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.238667 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "270281e8-f136-4d50-aff8-beb6e1c67ae2" (UID: "270281e8-f136-4d50-aff8-beb6e1c67ae2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.296376 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270281e8-f136-4d50-aff8-beb6e1c67ae2-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.471857 4780 generic.go:334] "Generic (PLEG): container finished" podID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerID="9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e" exitCode=0 Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.471929 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerDied","Data":"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e"} Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.472001 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9c55x" event={"ID":"270281e8-f136-4d50-aff8-beb6e1c67ae2","Type":"ContainerDied","Data":"bd4fee9095adb7087992a62e1c03799c18bedfc8d4801a8dcaf321953ffc41da"} Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.472068 4780 scope.go:117] "RemoveContainer" containerID="9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.472273 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9c55x" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.505697 4780 scope.go:117] "RemoveContainer" containerID="6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.529619 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.539854 4780 scope.go:117] "RemoveContainer" containerID="24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.540749 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9c55x"] Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.567262 4780 scope.go:117] "RemoveContainer" containerID="9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e" Sep 29 19:17:36 crc kubenswrapper[4780]: E0929 19:17:36.567715 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e\": container with ID starting with 9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e not found: ID does not exist" containerID="9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.567749 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e"} err="failed to get container status \"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e\": rpc error: code = NotFound desc = could not find container \"9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e\": container with ID starting with 9c1f4ebe797d199846c5c65318d321f4fab466e65162ad6838c0f4e824ac984e not found: ID does not exist" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.567774 4780 scope.go:117] "RemoveContainer" containerID="6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453" Sep 29 19:17:36 crc kubenswrapper[4780]: E0929 19:17:36.568003 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453\": container with ID starting with 6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453 not found: ID does not exist" containerID="6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.568037 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453"} err="failed to get container status \"6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453\": rpc error: code = NotFound desc = could not find container \"6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453\": container with ID starting with 6ff8708738bba2fdc59a8a450b31b0e303f744a1668e85c7b0e4afe9c83a8453 not found: ID does not exist" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.568070 4780 scope.go:117] "RemoveContainer" containerID="24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1" Sep 29 19:17:36 crc kubenswrapper[4780]: E0929 19:17:36.568281 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1\": container with ID starting with 24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1 not found: ID does not exist" containerID="24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.568301 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1"} err="failed to get container status \"24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1\": rpc error: code = NotFound desc = could not find container \"24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1\": container with ID starting with 24bc7c9e817c1be23e46532d627d162a681e98ac31f2f6cd81f1c02b092c24d1 not found: ID does not exist" Sep 29 19:17:36 crc kubenswrapper[4780]: I0929 19:17:36.765512 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" path="/var/lib/kubelet/pods/270281e8-f136-4d50-aff8-beb6e1c67ae2/volumes" Sep 29 19:18:33 crc kubenswrapper[4780]: I0929 19:18:33.223614 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:18:33 crc kubenswrapper[4780]: I0929 19:18:33.224547 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:19:03 crc kubenswrapper[4780]: I0929 19:19:03.226612 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:19:03 crc kubenswrapper[4780]: I0929 19:19:03.227469 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.223424 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.224249 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.224334 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.225259 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.225336 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" gracePeriod=600 Sep 29 19:19:33 crc kubenswrapper[4780]: E0929 19:19:33.373306 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.684856 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" exitCode=0 Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.684914 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c"} Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.684959 4780 scope.go:117] "RemoveContainer" containerID="fed7f1d0dd0c8bd2742028ee8d1b345741db16a32e9dad5b20e711baddf17493" Sep 29 19:19:33 crc kubenswrapper[4780]: I0929 19:19:33.685804 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:19:33 crc kubenswrapper[4780]: E0929 19:19:33.686325 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:19:47 crc kubenswrapper[4780]: I0929 19:19:47.753653 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:19:47 crc kubenswrapper[4780]: E0929 19:19:47.754809 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:20:00 crc 
kubenswrapper[4780]: I0929 19:20:00.761037 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:20:00 crc kubenswrapper[4780]: E0929 19:20:00.762830 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:20:13 crc kubenswrapper[4780]: I0929 19:20:13.753518 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:20:13 crc kubenswrapper[4780]: E0929 19:20:13.754790 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:20:28 crc kubenswrapper[4780]: I0929 19:20:28.753233 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:20:28 crc kubenswrapper[4780]: E0929 19:20:28.756018 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:20:39 crc kubenswrapper[4780]: I0929 19:20:39.753744 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:20:39 crc kubenswrapper[4780]: E0929 19:20:39.754858 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:20:51 crc kubenswrapper[4780]: I0929 19:20:51.753735 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:20:51 crc kubenswrapper[4780]: E0929 19:20:51.756653 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:21:03 crc kubenswrapper[4780]: I0929 19:21:03.753480 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:21:03 crc 
kubenswrapper[4780]: E0929 19:21:03.755021 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:21:18 crc kubenswrapper[4780]: I0929 19:21:18.754010 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:21:18 crc kubenswrapper[4780]: E0929 19:21:18.755143 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:21:31 crc kubenswrapper[4780]: I0929 19:21:31.754532 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:21:31 crc kubenswrapper[4780]: E0929 19:21:31.755835 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:21:42 crc kubenswrapper[4780]: I0929 19:21:42.754120 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:21:42 crc kubenswrapper[4780]: E0929 19:21:42.755284 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:21:56 crc kubenswrapper[4780]: I0929 19:21:56.753546 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:21:56 crc kubenswrapper[4780]: E0929 19:21:56.754665 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:22:09 crc kubenswrapper[4780]: I0929 19:22:09.753416 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:22:09 crc kubenswrapper[4780]: E0929 19:22:09.755950 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:22:24 crc kubenswrapper[4780]: I0929 19:22:24.755075 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:22:24 crc kubenswrapper[4780]: E0929 19:22:24.755976 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:22:36 crc kubenswrapper[4780]: I0929 19:22:36.753914 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:22:36 crc kubenswrapper[4780]: E0929 19:22:36.755275 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:22:49 crc kubenswrapper[4780]: I0929 19:22:49.753235 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:22:49 crc kubenswrapper[4780]: E0929 19:22:49.754419 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:23:04 crc kubenswrapper[4780]: I0929 19:23:04.753552 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:23:04 crc kubenswrapper[4780]: E0929 19:23:04.754819 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:23:17 crc kubenswrapper[4780]: I0929 19:23:17.758463 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:23:17 crc kubenswrapper[4780]: E0929 19:23:17.759963 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:23:28 crc kubenswrapper[4780]: I0929 19:23:28.755540 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:23:28 crc kubenswrapper[4780]: E0929 19:23:28.756578 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:23:39 crc kubenswrapper[4780]: I0929 19:23:39.754918 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:23:39 crc kubenswrapper[4780]: E0929 19:23:39.756158 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:23:54 crc kubenswrapper[4780]: I0929 19:23:54.754709 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:23:54 crc kubenswrapper[4780]: E0929 19:23:54.756104 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:24:08 crc kubenswrapper[4780]: I0929 19:24:08.754216 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:24:08 crc kubenswrapper[4780]: E0929 19:24:08.755335 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:24:19 crc kubenswrapper[4780]: I0929 19:24:19.753792 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:24:19 crc kubenswrapper[4780]: E0929 19:24:19.754918 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:24:34 crc kubenswrapper[4780]: I0929 19:24:34.752569 4780 
scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:24:35 crc kubenswrapper[4780]: I0929 19:24:35.774492 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40"} Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.640471 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ztbw"] Sep 29 19:26:36 crc kubenswrapper[4780]: E0929 19:26:36.642875 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="extract-content" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.642939 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="extract-content" Sep 29 19:26:36 crc kubenswrapper[4780]: E0929 19:26:36.642994 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="registry-server" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.643014 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="registry-server" Sep 29 19:26:36 crc kubenswrapper[4780]: E0929 19:26:36.643032 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="extract-utilities" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.643076 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="extract-utilities" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.643861 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="270281e8-f136-4d50-aff8-beb6e1c67ae2" containerName="registry-server" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.647335 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.654456 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ztbw"] Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.794757 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5fbz\" (UniqueName: \"kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.794815 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.794882 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.897762 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5fbz\" (UniqueName: \"kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.897970 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.898215 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.898532 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.899427 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:36 crc kubenswrapper[4780]: I0929 19:26:36.919434 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c5fbz\" (UniqueName: \"kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz\") pod \"community-operators-8ztbw\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") " pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.003819 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ztbw" Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.466175 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ztbw"] Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.971464 4780 generic.go:334] "Generic (PLEG): container finished" podID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerID="181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4" exitCode=0 Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.971679 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerDied","Data":"181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4"} Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.971891 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerStarted","Data":"c48dcffae76fb5d987d9fe44798ff7bbef01fd2371bb61aa4098d3e1ce8e1838"} Sep 29 19:26:37 crc kubenswrapper[4780]: I0929 19:26:37.975539 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:26:38 crc kubenswrapper[4780]: I0929 19:26:38.982528 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerStarted","Data":"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"} Sep 29 19:26:39 crc kubenswrapper[4780]: I0929 19:26:39.996588 4780 generic.go:334] "Generic (PLEG): container finished" podID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerID="7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a" exitCode=0 Sep 29 19:26:39 crc kubenswrapper[4780]: I0929 19:26:39.996677 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerDied","Data":"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"} Sep 29 19:26:41 crc kubenswrapper[4780]: I0929 19:26:41.008221 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerStarted","Data":"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"} Sep 29 19:26:41 crc kubenswrapper[4780]: I0929 19:26:41.037023 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ztbw" podStartSLOduration=2.582525764 podStartE2EDuration="5.036996993s" podCreationTimestamp="2025-09-29 19:26:36 +0000 UTC" firstStartedPulling="2025-09-29 19:26:37.975206607 +0000 UTC m=+2597.923504651" lastFinishedPulling="2025-09-29 19:26:40.429677796 +0000 UTC m=+2600.377975880" observedRunningTime="2025-09-29 19:26:41.032659189 +0000 UTC m=+2600.980957243" watchObservedRunningTime="2025-09-29 
Sep 29 19:26:47 crc kubenswrapper[4780]: I0929 19:26:47.004820 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:47 crc kubenswrapper[4780]: I0929 19:26:47.005466 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:47 crc kubenswrapper[4780]: I0929 19:26:47.090300 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:47 crc kubenswrapper[4780]: I0929 19:26:47.163805 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:47 crc kubenswrapper[4780]: I0929 19:26:47.336650 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ztbw"]
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.083667 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ztbw" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerName="registry-server" containerID="cri-o://dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536" gracePeriod=2
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.606496 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.717393 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5fbz\" (UniqueName: \"kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz\") pod \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") "
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.717454 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities\") pod \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") "
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.717591 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content\") pod \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\" (UID: \"85ab770b-a654-4058-b1e4-adf5a1ca35f7\") "
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.718343 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities" (OuterVolumeSpecName: "utilities") pod "85ab770b-a654-4058-b1e4-adf5a1ca35f7" (UID: "85ab770b-a654-4058-b1e4-adf5a1ca35f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.727303 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz" (OuterVolumeSpecName: "kube-api-access-c5fbz") pod "85ab770b-a654-4058-b1e4-adf5a1ca35f7" (UID: "85ab770b-a654-4058-b1e4-adf5a1ca35f7"). InnerVolumeSpecName "kube-api-access-c5fbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.762598 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85ab770b-a654-4058-b1e4-adf5a1ca35f7" (UID: "85ab770b-a654-4058-b1e4-adf5a1ca35f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.819966 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5fbz\" (UniqueName: \"kubernetes.io/projected/85ab770b-a654-4058-b1e4-adf5a1ca35f7-kube-api-access-c5fbz\") on node \"crc\" DevicePath \"\""
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.820019 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 19:26:49 crc kubenswrapper[4780]: I0929 19:26:49.820041 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85ab770b-a654-4058-b1e4-adf5a1ca35f7-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.100573 4780 generic.go:334] "Generic (PLEG): container finished" podID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerID="dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536" exitCode=0
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.100637 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerDied","Data":"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"}
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.100692 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ztbw" event={"ID":"85ab770b-a654-4058-b1e4-adf5a1ca35f7","Type":"ContainerDied","Data":"c48dcffae76fb5d987d9fe44798ff7bbef01fd2371bb61aa4098d3e1ce8e1838"}
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.100729 4780 scope.go:117] "RemoveContainer" containerID="dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.100766 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ztbw"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.149575 4780 scope.go:117] "RemoveContainer" containerID="7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.150575 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ztbw"]
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.163485 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ztbw"]
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.178238 4780 scope.go:117] "RemoveContainer" containerID="181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.216649 4780 scope.go:117] "RemoveContainer" containerID="dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"
Sep 29 19:26:50 crc kubenswrapper[4780]: E0929 19:26:50.217190 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536\": container with ID starting with dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536 not found: ID does not exist" containerID="dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.217252 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536"} err="failed to get container status \"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536\": rpc error: code = NotFound desc = could not find container \"dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536\": container with ID starting with dacb9f6d89438d80de5eb460e0f1184267a5977864e43229525c87bd9305e536 not found: ID does not exist"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.217295 4780 scope.go:117] "RemoveContainer" containerID="7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"
Sep 29 19:26:50 crc kubenswrapper[4780]: E0929 19:26:50.217622 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a\": container with ID starting with 7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a not found: ID does not exist" containerID="7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.217688 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a"} err="failed to get container status \"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a\": rpc error: code = NotFound desc = could not find container \"7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a\": container with ID starting with 7a5ebda8f0d6a7de9c7810214131b7c6d6c8edfd578197e07528fc584690689a not found: ID does not exist"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.217715 4780 scope.go:117] "RemoveContainer" containerID="181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4"
Sep 29 19:26:50 crc kubenswrapper[4780]: E0929 19:26:50.217959 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4\": container with ID starting with 181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4 not found: ID does not exist" containerID="181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.217992 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4"} err="failed to get container status \"181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4\": rpc error: code = NotFound desc = could not find container \"181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4\": container with ID starting with 181d73b198686a62743272c0dda5a1d6a0579e97ceff323ab4b6eab8b688ace4 not found: ID does not exist"
Sep 29 19:26:50 crc kubenswrapper[4780]: I0929 19:26:50.763006 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" path="/var/lib/kubelet/pods/85ab770b-a654-4058-b1e4-adf5a1ca35f7/volumes"
"Deleted CPUSet assignment" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerName="extract-content" Sep 29 19:27:50 crc kubenswrapper[4780]: E0929 19:27:50.253729 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerName="registry-server" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.253742 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerName="registry-server" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.254010 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ab770b-a654-4058-b1e4-adf5a1ca35f7" containerName="registry-server" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.255913 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.261925 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6mpn\" (UniqueName: \"kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.262140 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.262212 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.273602 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nmr7"] Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.363722 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6mpn\" (UniqueName: \"kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.363783 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.363807 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: 
I0929 19:27:50.364430 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.365324 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.390080 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6mpn\" (UniqueName: \"kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn\") pod \"certified-operators-6nmr7\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:50 crc kubenswrapper[4780]: I0929 19:27:50.581775 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:27:51 crc kubenswrapper[4780]: I0929 19:27:51.098129 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nmr7"] Sep 29 19:27:51 crc kubenswrapper[4780]: I0929 19:27:51.747000 4780 generic.go:334] "Generic (PLEG): container finished" podID="a2adbd70-5e54-48c3-89cf-efc421230861" containerID="ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553" exitCode=0 Sep 29 19:27:51 crc kubenswrapper[4780]: I0929 19:27:51.747114 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerDied","Data":"ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553"} Sep 29 19:27:51 crc kubenswrapper[4780]: I0929 19:27:51.747510 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerStarted","Data":"c1ae0f6d404e468697dec02fdff992d073c2c1075c0ab08f3ea9a7c6d8e8b99a"} Sep 29 19:27:52 crc kubenswrapper[4780]: I0929 19:27:52.768550 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerStarted","Data":"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935"} Sep 29 19:27:53 crc kubenswrapper[4780]: I0929 19:27:53.778382 4780 generic.go:334] "Generic (PLEG): container finished" podID="a2adbd70-5e54-48c3-89cf-efc421230861" containerID="cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935" exitCode=0 Sep 29 19:27:53 crc kubenswrapper[4780]: I0929 19:27:53.778432 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerDied","Data":"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935"} Sep 29 19:27:54 crc kubenswrapper[4780]: I0929 19:27:54.789189 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" 
event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerStarted","Data":"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3"} Sep 29 19:27:54 crc kubenswrapper[4780]: I0929 19:27:54.821355 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6nmr7" podStartSLOduration=2.319683765 podStartE2EDuration="4.821328163s" podCreationTimestamp="2025-09-29 19:27:50 +0000 UTC" firstStartedPulling="2025-09-29 19:27:51.750088765 +0000 UTC m=+2671.698386849" lastFinishedPulling="2025-09-29 19:27:54.251733193 +0000 UTC m=+2674.200031247" observedRunningTime="2025-09-29 19:27:54.81318695 +0000 UTC m=+2674.761485024" watchObservedRunningTime="2025-09-29 19:27:54.821328163 +0000 UTC m=+2674.769626247" Sep 29 19:28:00 crc kubenswrapper[4780]: I0929 19:28:00.582973 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:00 crc kubenswrapper[4780]: I0929 19:28:00.583518 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:00 crc kubenswrapper[4780]: I0929 19:28:00.661987 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:00 crc kubenswrapper[4780]: I0929 19:28:00.989638 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:01 crc kubenswrapper[4780]: I0929 19:28:01.073974 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nmr7"] Sep 29 19:28:02 crc kubenswrapper[4780]: I0929 19:28:02.873214 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6nmr7" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="registry-server" containerID="cri-o://b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3" gracePeriod=2 Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.223543 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.223995 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.224077 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.224825 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.224911 4780 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40" gracePeriod=600 Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.335963 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.393883 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content\") pod \"a2adbd70-5e54-48c3-89cf-efc421230861\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.394017 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6mpn\" (UniqueName: \"kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn\") pod \"a2adbd70-5e54-48c3-89cf-efc421230861\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.394163 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities\") pod \"a2adbd70-5e54-48c3-89cf-efc421230861\" (UID: \"a2adbd70-5e54-48c3-89cf-efc421230861\") " Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.395101 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities" (OuterVolumeSpecName: "utilities") pod "a2adbd70-5e54-48c3-89cf-efc421230861" (UID: "a2adbd70-5e54-48c3-89cf-efc421230861"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.401236 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn" (OuterVolumeSpecName: "kube-api-access-f6mpn") pod "a2adbd70-5e54-48c3-89cf-efc421230861" (UID: "a2adbd70-5e54-48c3-89cf-efc421230861"). InnerVolumeSpecName "kube-api-access-f6mpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.444259 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2adbd70-5e54-48c3-89cf-efc421230861" (UID: "a2adbd70-5e54-48c3-89cf-efc421230861"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.495802 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.495870 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2adbd70-5e54-48c3-89cf-efc421230861-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.495898 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6mpn\" (UniqueName: \"kubernetes.io/projected/a2adbd70-5e54-48c3-89cf-efc421230861-kube-api-access-f6mpn\") on node \"crc\" DevicePath \"\"" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.891644 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40" exitCode=0 Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.891777 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40"} Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.892038 4780 scope.go:117] "RemoveContainer" containerID="3453a33b8baa193345510a6583840f1762299e661090ccc34d526d0cd17ce71c" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.895607 4780 generic.go:334] "Generic (PLEG): container finished" podID="a2adbd70-5e54-48c3-89cf-efc421230861" containerID="b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3" exitCode=0 Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.895644 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerDied","Data":"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3"} Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.895671 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nmr7" event={"ID":"a2adbd70-5e54-48c3-89cf-efc421230861","Type":"ContainerDied","Data":"c1ae0f6d404e468697dec02fdff992d073c2c1075c0ab08f3ea9a7c6d8e8b99a"} Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.895883 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6nmr7" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.926380 4780 scope.go:117] "RemoveContainer" containerID="b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.945243 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nmr7"] Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.955156 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6nmr7"] Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.964523 4780 scope.go:117] "RemoveContainer" containerID="cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935" Sep 29 19:28:03 crc kubenswrapper[4780]: I0929 19:28:03.994650 4780 scope.go:117] "RemoveContainer" containerID="ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.035915 4780 scope.go:117] "RemoveContainer" containerID="b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3" Sep 29 19:28:04 crc kubenswrapper[4780]: E0929 19:28:04.036342 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3\": container with ID starting with b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3 not found: ID does not exist" containerID="b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.036387 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3"} err="failed to get container status \"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3\": rpc error: code = NotFound desc = could not find container \"b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3\": container with ID starting with b4a6d3d72f77303cb01e6f1b28eb834211e33ada24cd3636b7f25fd3304adfc3 not found: ID does not exist" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.036422 4780 scope.go:117] "RemoveContainer" containerID="cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935" Sep 29 19:28:04 crc kubenswrapper[4780]: E0929 19:28:04.038732 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935\": container with ID starting with cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935 not found: ID does not exist" containerID="cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.038799 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935"} err="failed to get container status \"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935\": rpc error: code = NotFound desc = could not find container \"cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935\": container with ID starting with cbae1e710d4c21f95108f7cf355b43891eeca5ef8a4a36d6f9352a6662481935 not found: ID does not exist" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.038827 4780 scope.go:117] "RemoveContainer" 
containerID="ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553" Sep 29 19:28:04 crc kubenswrapper[4780]: E0929 19:28:04.039447 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553\": container with ID starting with ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553 not found: ID does not exist" containerID="ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.039467 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553"} err="failed to get container status \"ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553\": rpc error: code = NotFound desc = could not find container \"ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553\": container with ID starting with ade0601396d01dd86606a5ee221240974e10c1e58fccc7700e0fcf42d7350553 not found: ID does not exist" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.769296 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" path="/var/lib/kubelet/pods/a2adbd70-5e54-48c3-89cf-efc421230861/volumes" Sep 29 19:28:04 crc kubenswrapper[4780]: I0929 19:28:04.910318 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"} Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.164249 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp"] Sep 29 19:30:00 crc kubenswrapper[4780]: E0929 19:30:00.165186 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="extract-utilities" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.165207 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="extract-utilities" Sep 29 19:30:00 crc kubenswrapper[4780]: E0929 19:30:00.165239 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="extract-content" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.165251 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="extract-content" Sep 29 19:30:00 crc kubenswrapper[4780]: E0929 19:30:00.165285 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="registry-server" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.165298 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="registry-server" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.165561 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2adbd70-5e54-48c3-89cf-efc421230861" containerName="registry-server" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.166403 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.168652 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.170902 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.186870 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp"] Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.322891 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xksr6\" (UniqueName: \"kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.323278 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.323544 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.425689 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.425789 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xksr6\" (UniqueName: \"kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.425936 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.427495 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume\") pod 
\"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.438926 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.447661 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xksr6\" (UniqueName: \"kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6\") pod \"collect-profiles-29319570-d9clp\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.493430 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.977318 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp"] Sep 29 19:30:00 crc kubenswrapper[4780]: I0929 19:30:00.996607 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" event={"ID":"efef81a5-564e-46ca-b4ed-6bb53ffa4c23","Type":"ContainerStarted","Data":"7667166e6b9ae720a5f28a7db51202d663170ac0758d4d48026f07e5300469d9"} Sep 29 19:30:02 crc kubenswrapper[4780]: I0929 19:30:02.005288 4780 generic.go:334] "Generic (PLEG): container finished" podID="efef81a5-564e-46ca-b4ed-6bb53ffa4c23" containerID="42d2da8bd993b5df9a17139b927edc1c108a66a8827e246fcc3495b111a7323d" exitCode=0 Sep 29 19:30:02 crc kubenswrapper[4780]: I0929 19:30:02.005329 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" event={"ID":"efef81a5-564e-46ca-b4ed-6bb53ffa4c23","Type":"ContainerDied","Data":"42d2da8bd993b5df9a17139b927edc1c108a66a8827e246fcc3495b111a7323d"} Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.359571 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.469162 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xksr6\" (UniqueName: \"kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6\") pod \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.469272 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume\") pod \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.469426 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume\") pod \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\" (UID: \"efef81a5-564e-46ca-b4ed-6bb53ffa4c23\") " Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.470557 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume" (OuterVolumeSpecName: "config-volume") pod "efef81a5-564e-46ca-b4ed-6bb53ffa4c23" (UID: "efef81a5-564e-46ca-b4ed-6bb53ffa4c23"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.475540 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6" (OuterVolumeSpecName: "kube-api-access-xksr6") pod "efef81a5-564e-46ca-b4ed-6bb53ffa4c23" (UID: "efef81a5-564e-46ca-b4ed-6bb53ffa4c23"). InnerVolumeSpecName "kube-api-access-xksr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.476115 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "efef81a5-564e-46ca-b4ed-6bb53ffa4c23" (UID: "efef81a5-564e-46ca-b4ed-6bb53ffa4c23"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.571338 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.571396 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xksr6\" (UniqueName: \"kubernetes.io/projected/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-kube-api-access-xksr6\") on node \"crc\" DevicePath \"\"" Sep 29 19:30:03 crc kubenswrapper[4780]: I0929 19:30:03.571414 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efef81a5-564e-46ca-b4ed-6bb53ffa4c23-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.024934 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" event={"ID":"efef81a5-564e-46ca-b4ed-6bb53ffa4c23","Type":"ContainerDied","Data":"7667166e6b9ae720a5f28a7db51202d663170ac0758d4d48026f07e5300469d9"} Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.024992 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7667166e6b9ae720a5f28a7db51202d663170ac0758d4d48026f07e5300469d9" Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.025457 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp" Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.439918 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"] Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.446662 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319525-pq9tq"] Sep 29 19:30:04 crc kubenswrapper[4780]: I0929 19:30:04.774311 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482b8d66-a8d0-4d21-ba06-6f818f092ea7" path="/var/lib/kubelet/pods/482b8d66-a8d0-4d21-ba06-6f818f092ea7/volumes" Sep 29 19:30:26 crc kubenswrapper[4780]: I0929 19:30:26.196677 4780 scope.go:117] "RemoveContainer" containerID="f69bc3d24231e94ded3d05f44bd33eefd8d624c8cb5bc2dd5ae850bdcad2b12e" Sep 29 19:30:33 crc kubenswrapper[4780]: I0929 19:30:33.222994 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:30:33 crc kubenswrapper[4780]: I0929 19:30:33.223749 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:31:03 crc kubenswrapper[4780]: I0929 19:31:03.223431 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Sep 29 19:31:03 crc kubenswrapper[4780]: I0929 19:31:03.224115 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.223314 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.223909 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.223974 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.224838 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.224937 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" gracePeriod=600
Sep 29 19:31:33 crc kubenswrapper[4780]: E0929 19:31:33.359000 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.881257 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" exitCode=0
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.881321 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"}
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.881368 4780 scope.go:117] "RemoveContainer" containerID="13615b1fbb970015fa2317330c6ca78e99d500c62d390d5b062ba91134b17e40"
Sep 29 19:31:33 crc kubenswrapper[4780]: I0929 19:31:33.882263 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:31:33 crc kubenswrapper[4780]: E0929 19:31:33.882779 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:31:47 crc kubenswrapper[4780]: I0929 19:31:47.753263 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:31:47 crc kubenswrapper[4780]: E0929 19:31:47.754325 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:31:58 crc kubenswrapper[4780]: I0929 19:31:58.753269 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:31:58 crc kubenswrapper[4780]: E0929 19:31:58.754443 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:32:13 crc kubenswrapper[4780]: I0929 19:32:13.752745 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:32:13 crc kubenswrapper[4780]: E0929 19:32:13.753711 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:32:27 crc kubenswrapper[4780]: I0929 19:32:27.753123 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:32:27 crc kubenswrapper[4780]: E0929 19:32:27.754085 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:32:41 crc kubenswrapper[4780]: I0929 19:32:41.753584 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:32:41 crc kubenswrapper[4780]: E0929 19:32:41.754793 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:32:53 crc kubenswrapper[4780]: I0929 19:32:53.753659 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:32:53 crc kubenswrapper[4780]: E0929 19:32:53.754750 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:33:07 crc kubenswrapper[4780]: I0929 19:33:07.753329 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e"
Sep 29 19:33:07 crc kubenswrapper[4780]: E0929 19:33:07.754429 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.144967 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"]
Sep 29 19:33:13 crc kubenswrapper[4780]: E0929 19:33:13.146207 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efef81a5-564e-46ca-b4ed-6bb53ffa4c23" containerName="collect-profiles"
Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.146231 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="efef81a5-564e-46ca-b4ed-6bb53ffa4c23" containerName="collect-profiles"
Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.146514 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="efef81a5-564e-46ca-b4ed-6bb53ffa4c23" containerName="collect-profiles"
Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.148281 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgkf4"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.162292 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"] Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.284539 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.284626 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.285210 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsjg\" (UniqueName: \"kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.386756 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.386817 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.386875 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsjg\" (UniqueName: \"kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.387629 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.387868 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.420169 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dfsjg\" (UniqueName: \"kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg\") pod \"redhat-marketplace-rgkf4\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.487592 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.546447 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.548942 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.552989 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.691098 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.691641 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8q4\" (UniqueName: \"kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.691704 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.792320 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.792371 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8q4\" (UniqueName: \"kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.792409 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.792803 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.793203 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.817929 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8q4\" (UniqueName: \"kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4\") pod \"redhat-operators-2hfkd\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:13 crc kubenswrapper[4780]: I0929 19:33:13.886942 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.034856 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"] Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.322917 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:14 crc kubenswrapper[4780]: W0929 19:33:14.330406 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54aa5840_685f_4622_a07a_39cd44c866d6.slice/crio-f4f31f1cf773fb34895c01ef89681c4667bfdcf98a1c4605295062af09ef917e WatchSource:0}: Error finding container f4f31f1cf773fb34895c01ef89681c4667bfdcf98a1c4605295062af09ef917e: Status 404 returned error can't find the container with id f4f31f1cf773fb34895c01ef89681c4667bfdcf98a1c4605295062af09ef917e Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.842615 4780 generic.go:334] "Generic (PLEG): container finished" podID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerID="476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17" exitCode=0 Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.842759 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerDied","Data":"476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17"} Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.843377 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerStarted","Data":"c14910184f8e232bf14392410af16eac86bd2e9d42f422697c2e5770d7a89999"} Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.844747 4780 generic.go:334] "Generic (PLEG): container finished" podID="54aa5840-685f-4622-a07a-39cd44c866d6" containerID="99dfe60a6bbdd9e75858da302894a5a5dff227ff7ccf0e0973d8aea4fbd923c7" exitCode=0 Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.844981 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" 
event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerDied","Data":"99dfe60a6bbdd9e75858da302894a5a5dff227ff7ccf0e0973d8aea4fbd923c7"} Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.845908 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerStarted","Data":"f4f31f1cf773fb34895c01ef89681c4667bfdcf98a1c4605295062af09ef917e"} Sep 29 19:33:14 crc kubenswrapper[4780]: I0929 19:33:14.846313 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:33:16 crc kubenswrapper[4780]: I0929 19:33:16.871598 4780 generic.go:334] "Generic (PLEG): container finished" podID="54aa5840-685f-4622-a07a-39cd44c866d6" containerID="4218efc6d779154150d43209213cc4fae67259f66640b2d05a9a74ed9c73c009" exitCode=0 Sep 29 19:33:16 crc kubenswrapper[4780]: I0929 19:33:16.872228 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerDied","Data":"4218efc6d779154150d43209213cc4fae67259f66640b2d05a9a74ed9c73c009"} Sep 29 19:33:16 crc kubenswrapper[4780]: I0929 19:33:16.877694 4780 generic.go:334] "Generic (PLEG): container finished" podID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerID="81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342" exitCode=0 Sep 29 19:33:16 crc kubenswrapper[4780]: I0929 19:33:16.877757 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerDied","Data":"81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342"} Sep 29 19:33:18 crc kubenswrapper[4780]: I0929 19:33:18.906998 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerStarted","Data":"13c418f93864fa303159037bb3bb244102de5298dc722a49ba2b10579cc0eede"} Sep 29 19:33:18 crc kubenswrapper[4780]: I0929 19:33:18.911737 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerStarted","Data":"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8"} Sep 29 19:33:18 crc kubenswrapper[4780]: I0929 19:33:18.942594 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hfkd" podStartSLOduration=2.804120479 podStartE2EDuration="5.94257661s" podCreationTimestamp="2025-09-29 19:33:13 +0000 UTC" firstStartedPulling="2025-09-29 19:33:14.846173166 +0000 UTC m=+2994.794471210" lastFinishedPulling="2025-09-29 19:33:17.984629257 +0000 UTC m=+2997.932927341" observedRunningTime="2025-09-29 19:33:18.94151175 +0000 UTC m=+2998.889809834" watchObservedRunningTime="2025-09-29 19:33:18.94257661 +0000 UTC m=+2998.890874664" Sep 29 19:33:18 crc kubenswrapper[4780]: I0929 19:33:18.970265 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rgkf4" podStartSLOduration=2.863295769 podStartE2EDuration="5.97024384s" podCreationTimestamp="2025-09-29 19:33:13 +0000 UTC" firstStartedPulling="2025-09-29 19:33:14.84559509 +0000 UTC m=+2994.793893134" lastFinishedPulling="2025-09-29 19:33:17.952543131 +0000 UTC m=+2997.900841205" 
observedRunningTime="2025-09-29 19:33:18.964525926 +0000 UTC m=+2998.912823990" watchObservedRunningTime="2025-09-29 19:33:18.97024384 +0000 UTC m=+2998.918541894" Sep 29 19:33:19 crc kubenswrapper[4780]: I0929 19:33:19.753144 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:33:19 crc kubenswrapper[4780]: E0929 19:33:19.753452 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.488504 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.488896 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.566846 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.887824 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.887916 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:23 crc kubenswrapper[4780]: I0929 19:33:23.936191 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:24 crc kubenswrapper[4780]: I0929 19:33:24.010675 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:24 crc kubenswrapper[4780]: I0929 19:33:24.016475 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:25 crc kubenswrapper[4780]: I0929 19:33:25.730095 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"] Sep 29 19:33:25 crc kubenswrapper[4780]: I0929 19:33:25.973069 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rgkf4" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="registry-server" containerID="cri-o://c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8" gracePeriod=2 Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.408782 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.523687 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.523889 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hfkd" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="registry-server" containerID="cri-o://13c418f93864fa303159037bb3bb244102de5298dc722a49ba2b10579cc0eede" gracePeriod=2 Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.604531 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities\") pod \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.604629 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content\") pod \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.604675 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfsjg\" (UniqueName: \"kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg\") pod \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\" (UID: \"24e234ce-e4d1-4884-9a7b-1baf38678dc3\") " Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.605361 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities" (OuterVolumeSpecName: "utilities") pod "24e234ce-e4d1-4884-9a7b-1baf38678dc3" (UID: "24e234ce-e4d1-4884-9a7b-1baf38678dc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.609895 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg" (OuterVolumeSpecName: "kube-api-access-dfsjg") pod "24e234ce-e4d1-4884-9a7b-1baf38678dc3" (UID: "24e234ce-e4d1-4884-9a7b-1baf38678dc3"). InnerVolumeSpecName "kube-api-access-dfsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.624511 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24e234ce-e4d1-4884-9a7b-1baf38678dc3" (UID: "24e234ce-e4d1-4884-9a7b-1baf38678dc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.706058 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.706095 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24e234ce-e4d1-4884-9a7b-1baf38678dc3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.706107 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfsjg\" (UniqueName: \"kubernetes.io/projected/24e234ce-e4d1-4884-9a7b-1baf38678dc3-kube-api-access-dfsjg\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.983108 4780 generic.go:334] "Generic (PLEG): container finished" podID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerID="c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8" exitCode=0 Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.983166 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerDied","Data":"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8"} Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.983211 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgkf4" Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.983242 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgkf4" event={"ID":"24e234ce-e4d1-4884-9a7b-1baf38678dc3","Type":"ContainerDied","Data":"c14910184f8e232bf14392410af16eac86bd2e9d42f422697c2e5770d7a89999"} Sep 29 19:33:26 crc kubenswrapper[4780]: I0929 19:33:26.983265 4780 scope.go:117] "RemoveContainer" containerID="c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.012377 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"] Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.012722 4780 scope.go:117] "RemoveContainer" containerID="81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.039103 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgkf4"] Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.056850 4780 scope.go:117] "RemoveContainer" containerID="476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.083352 4780 scope.go:117] "RemoveContainer" containerID="c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8" Sep 29 19:33:27 crc kubenswrapper[4780]: E0929 19:33:27.083980 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8\": container with ID starting with c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8 not found: ID does not exist" containerID="c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.084071 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8"} err="failed to get container status \"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8\": rpc error: code = NotFound desc = could not find container \"c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8\": container with ID starting with c642cb6eaeafcf4db5a2e981426bea1a285f3227f7e7eada7047412a6b2f1ab8 not found: ID does not exist" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.084117 4780 scope.go:117] "RemoveContainer" containerID="81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342" Sep 29 19:33:27 crc kubenswrapper[4780]: E0929 19:33:27.084568 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342\": container with ID starting with 81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342 not found: ID does not exist" containerID="81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.084608 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342"} err="failed to get container status \"81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342\": rpc error: code = NotFound desc = could not find container \"81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342\": container with ID starting with 81c8e213da77ec7852b5cededa407a960b28a33e33e80e3ea0f04e709e484342 not found: ID does not exist" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.084635 4780 scope.go:117] "RemoveContainer" containerID="476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17" Sep 29 19:33:27 crc kubenswrapper[4780]: E0929 19:33:27.084923 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17\": container with ID starting with 476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17 not found: ID does not exist" containerID="476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.084972 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17"} err="failed to get container status \"476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17\": rpc error: code = NotFound desc = could not find container \"476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17\": container with ID starting with 476bd9368e93e847b0106f19672876faf76284f478f839528509be6a24236f17 not found: ID does not exist" Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.997238 4780 generic.go:334] "Generic (PLEG): container finished" podID="54aa5840-685f-4622-a07a-39cd44c866d6" containerID="13c418f93864fa303159037bb3bb244102de5298dc722a49ba2b10579cc0eede" exitCode=0 Sep 29 19:33:27 crc kubenswrapper[4780]: I0929 19:33:27.997337 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" 
event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerDied","Data":"13c418f93864fa303159037bb3bb244102de5298dc722a49ba2b10579cc0eede"} Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.122826 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.233073 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content\") pod \"54aa5840-685f-4622-a07a-39cd44c866d6\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.233131 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities\") pod \"54aa5840-685f-4622-a07a-39cd44c866d6\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.233186 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr8q4\" (UniqueName: \"kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4\") pod \"54aa5840-685f-4622-a07a-39cd44c866d6\" (UID: \"54aa5840-685f-4622-a07a-39cd44c866d6\") " Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.234765 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities" (OuterVolumeSpecName: "utilities") pod "54aa5840-685f-4622-a07a-39cd44c866d6" (UID: "54aa5840-685f-4622-a07a-39cd44c866d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.243389 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4" (OuterVolumeSpecName: "kube-api-access-kr8q4") pod "54aa5840-685f-4622-a07a-39cd44c866d6" (UID: "54aa5840-685f-4622-a07a-39cd44c866d6"). InnerVolumeSpecName "kube-api-access-kr8q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.335755 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.335810 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr8q4\" (UniqueName: \"kubernetes.io/projected/54aa5840-685f-4622-a07a-39cd44c866d6-kube-api-access-kr8q4\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.356210 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54aa5840-685f-4622-a07a-39cd44c866d6" (UID: "54aa5840-685f-4622-a07a-39cd44c866d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.438118 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54aa5840-685f-4622-a07a-39cd44c866d6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:33:28 crc kubenswrapper[4780]: I0929 19:33:28.768590 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" path="/var/lib/kubelet/pods/24e234ce-e4d1-4884-9a7b-1baf38678dc3/volumes" Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.023276 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hfkd" event={"ID":"54aa5840-685f-4622-a07a-39cd44c866d6","Type":"ContainerDied","Data":"f4f31f1cf773fb34895c01ef89681c4667bfdcf98a1c4605295062af09ef917e"} Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.023363 4780 scope.go:117] "RemoveContainer" containerID="13c418f93864fa303159037bb3bb244102de5298dc722a49ba2b10579cc0eede" Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.023359 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hfkd" Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.053120 4780 scope.go:117] "RemoveContainer" containerID="4218efc6d779154150d43209213cc4fae67259f66640b2d05a9a74ed9c73c009" Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.055438 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.064397 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hfkd"] Sep 29 19:33:29 crc kubenswrapper[4780]: I0929 19:33:29.075108 4780 scope.go:117] "RemoveContainer" containerID="99dfe60a6bbdd9e75858da302894a5a5dff227ff7ccf0e0973d8aea4fbd923c7" Sep 29 19:33:30 crc kubenswrapper[4780]: I0929 19:33:30.768670 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" path="/var/lib/kubelet/pods/54aa5840-685f-4622-a07a-39cd44c866d6/volumes" Sep 29 19:33:32 crc kubenswrapper[4780]: I0929 19:33:32.754090 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:33:32 crc kubenswrapper[4780]: E0929 19:33:32.755121 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:33:44 crc kubenswrapper[4780]: I0929 19:33:44.753881 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:33:44 crc kubenswrapper[4780]: E0929 19:33:44.754793 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" 
podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:33:55 crc kubenswrapper[4780]: I0929 19:33:55.753656 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:33:55 crc kubenswrapper[4780]: E0929 19:33:55.755014 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:34:06 crc kubenswrapper[4780]: I0929 19:34:06.754434 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:34:06 crc kubenswrapper[4780]: E0929 19:34:06.755691 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:34:21 crc kubenswrapper[4780]: I0929 19:34:21.753300 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:34:21 crc kubenswrapper[4780]: E0929 19:34:21.754318 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:34:35 crc kubenswrapper[4780]: I0929 19:34:35.753546 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:34:35 crc kubenswrapper[4780]: E0929 19:34:35.754712 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:34:48 crc kubenswrapper[4780]: I0929 19:34:48.753175 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:34:48 crc kubenswrapper[4780]: E0929 19:34:48.754263 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:35:02 crc kubenswrapper[4780]: I0929 19:35:02.754425 4780 scope.go:117] "RemoveContainer" 
containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:35:02 crc kubenswrapper[4780]: E0929 19:35:02.755394 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:35:14 crc kubenswrapper[4780]: I0929 19:35:14.753459 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:35:14 crc kubenswrapper[4780]: E0929 19:35:14.754959 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:35:27 crc kubenswrapper[4780]: I0929 19:35:27.753315 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:35:27 crc kubenswrapper[4780]: E0929 19:35:27.755887 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:35:38 crc kubenswrapper[4780]: I0929 19:35:38.753487 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:35:38 crc kubenswrapper[4780]: E0929 19:35:38.755229 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:35:49 crc kubenswrapper[4780]: I0929 19:35:49.753037 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:35:49 crc kubenswrapper[4780]: E0929 19:35:49.753761 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:36:04 crc kubenswrapper[4780]: I0929 19:36:04.754131 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:36:04 crc kubenswrapper[4780]: E0929 19:36:04.754957 4780 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:36:15 crc kubenswrapper[4780]: I0929 19:36:15.753237 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:36:15 crc kubenswrapper[4780]: E0929 19:36:15.754490 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:36:27 crc kubenswrapper[4780]: I0929 19:36:27.753318 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:36:27 crc kubenswrapper[4780]: E0929 19:36:27.754661 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:36:41 crc kubenswrapper[4780]: I0929 19:36:41.753660 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:36:42 crc kubenswrapper[4780]: I0929 19:36:42.954484 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312"} Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.185686 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.186790 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.186810 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.186836 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.186849 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.186882 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="extract-content" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.186895 4780 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="extract-content" Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.186916 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="extract-utilities" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.186928 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="extract-utilities" Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.186951 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="extract-utilities" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.186966 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="extract-utilities" Sep 29 19:38:10 crc kubenswrapper[4780]: E0929 19:38:10.187026 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="extract-content" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.187043 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="extract-content" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.187325 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e234ce-e4d1-4884-9a7b-1baf38678dc3" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.187356 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="54aa5840-685f-4622-a07a-39cd44c866d6" containerName="registry-server" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.189125 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.208957 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.353675 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.353908 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.354007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2px\" (UniqueName: \"kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.377209 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.379787 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.391918 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.455429 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2px\" (UniqueName: \"kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.455487 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm5tq\" (UniqueName: \"kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.455519 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.455721 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.455895 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.456068 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.456519 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.456590 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.488511 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nb2px\" (UniqueName: \"kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px\") pod \"community-operators-jjhgj\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.525310 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.557706 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5tq\" (UniqueName: \"kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.557756 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.557804 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.558276 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.558333 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.578062 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5tq\" (UniqueName: \"kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq\") pod \"certified-operators-flrv9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:10 crc kubenswrapper[4780]: I0929 19:38:10.703265 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.067198 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.171112 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:11 crc kubenswrapper[4780]: W0929 19:38:11.176259 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90746ff1_95b0_4874_a644_ceac428c45c9.slice/crio-dbee62b1d9a6ce433232dff0265f4dd3957fc6d9f696cd18ce3dfbeeb9bbe265 WatchSource:0}: Error finding container dbee62b1d9a6ce433232dff0265f4dd3957fc6d9f696cd18ce3dfbeeb9bbe265: Status 404 returned error can't find the container with id dbee62b1d9a6ce433232dff0265f4dd3957fc6d9f696cd18ce3dfbeeb9bbe265 Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.829038 4780 generic.go:334] "Generic (PLEG): container finished" podID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerID="b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7" exitCode=0 Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.829290 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerDied","Data":"b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7"} Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.829336 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerStarted","Data":"c86a579fc067da608f446b22f6506b0d9eec18df0d56e22dca11d51424433e1d"} Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.832650 4780 generic.go:334] "Generic (PLEG): container finished" podID="90746ff1-95b0-4874-a644-ceac428c45c9" containerID="0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f" exitCode=0 Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.832721 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerDied","Data":"0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f"} Sep 29 19:38:11 crc kubenswrapper[4780]: I0929 19:38:11.832771 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerStarted","Data":"dbee62b1d9a6ce433232dff0265f4dd3957fc6d9f696cd18ce3dfbeeb9bbe265"} Sep 29 19:38:13 crc kubenswrapper[4780]: I0929 19:38:13.859739 4780 generic.go:334] "Generic (PLEG): container finished" podID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerID="c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc" exitCode=0 Sep 29 19:38:13 crc kubenswrapper[4780]: I0929 19:38:13.859838 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerDied","Data":"c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc"} Sep 29 19:38:13 crc kubenswrapper[4780]: I0929 19:38:13.865857 4780 generic.go:334] "Generic (PLEG): container finished" podID="90746ff1-95b0-4874-a644-ceac428c45c9" 
containerID="c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d" exitCode=0 Sep 29 19:38:13 crc kubenswrapper[4780]: I0929 19:38:13.865922 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerDied","Data":"c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d"} Sep 29 19:38:14 crc kubenswrapper[4780]: I0929 19:38:14.881769 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerStarted","Data":"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be"} Sep 29 19:38:14 crc kubenswrapper[4780]: I0929 19:38:14.888499 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerStarted","Data":"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829"} Sep 29 19:38:14 crc kubenswrapper[4780]: I0929 19:38:14.910798 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-flrv9" podStartSLOduration=2.46542173 podStartE2EDuration="4.910775174s" podCreationTimestamp="2025-09-29 19:38:10 +0000 UTC" firstStartedPulling="2025-09-29 19:38:11.836823395 +0000 UTC m=+3291.785121479" lastFinishedPulling="2025-09-29 19:38:14.282176839 +0000 UTC m=+3294.230474923" observedRunningTime="2025-09-29 19:38:14.909998752 +0000 UTC m=+3294.858296826" watchObservedRunningTime="2025-09-29 19:38:14.910775174 +0000 UTC m=+3294.859073258" Sep 29 19:38:14 crc kubenswrapper[4780]: I0929 19:38:14.943127 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jjhgj" podStartSLOduration=2.511496584 podStartE2EDuration="4.943095866s" podCreationTimestamp="2025-09-29 19:38:10 +0000 UTC" firstStartedPulling="2025-09-29 19:38:11.834028325 +0000 UTC m=+3291.782326399" lastFinishedPulling="2025-09-29 19:38:14.265627597 +0000 UTC m=+3294.213925681" observedRunningTime="2025-09-29 19:38:14.942153149 +0000 UTC m=+3294.890451223" watchObservedRunningTime="2025-09-29 19:38:14.943095866 +0000 UTC m=+3294.891393950" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.525464 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.526165 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.610432 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.704625 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.705180 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:20 crc kubenswrapper[4780]: I0929 19:38:20.787651 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:21 crc kubenswrapper[4780]: I0929 19:38:21.010029 4780 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:21 crc kubenswrapper[4780]: I0929 19:38:21.030950 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:22 crc kubenswrapper[4780]: I0929 19:38:22.467340 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.457369 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.457863 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jjhgj" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="registry-server" containerID="cri-o://de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829" gracePeriod=2 Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.966163 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.969031 4780 generic.go:334] "Generic (PLEG): container finished" podID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerID="de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829" exitCode=0 Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.969067 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerDied","Data":"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829"} Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.969157 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjhgj" event={"ID":"2b4d624b-a733-46fa-b7ea-e3f311882f4b","Type":"ContainerDied","Data":"c86a579fc067da608f446b22f6506b0d9eec18df0d56e22dca11d51424433e1d"} Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.969205 4780 scope.go:117] "RemoveContainer" containerID="de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829" Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.969286 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-flrv9" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="registry-server" containerID="cri-o://84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be" gracePeriod=2 Sep 29 19:38:23 crc kubenswrapper[4780]: I0929 19:38:23.988591 4780 scope.go:117] "RemoveContainer" containerID="c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.019888 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content\") pod \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.021930 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities\") pod \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " Sep 29 
19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.022002 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb2px\" (UniqueName: \"kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px\") pod \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\" (UID: \"2b4d624b-a733-46fa-b7ea-e3f311882f4b\") " Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.023958 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities" (OuterVolumeSpecName: "utilities") pod "2b4d624b-a733-46fa-b7ea-e3f311882f4b" (UID: "2b4d624b-a733-46fa-b7ea-e3f311882f4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.028786 4780 scope.go:117] "RemoveContainer" containerID="b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.047109 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px" (OuterVolumeSpecName: "kube-api-access-nb2px") pod "2b4d624b-a733-46fa-b7ea-e3f311882f4b" (UID: "2b4d624b-a733-46fa-b7ea-e3f311882f4b"). InnerVolumeSpecName "kube-api-access-nb2px". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.090005 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b4d624b-a733-46fa-b7ea-e3f311882f4b" (UID: "2b4d624b-a733-46fa-b7ea-e3f311882f4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.123451 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.123483 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4d624b-a733-46fa-b7ea-e3f311882f4b-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.123491 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb2px\" (UniqueName: \"kubernetes.io/projected/2b4d624b-a733-46fa-b7ea-e3f311882f4b-kube-api-access-nb2px\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.123603 4780 scope.go:117] "RemoveContainer" containerID="de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829" Sep 29 19:38:24 crc kubenswrapper[4780]: E0929 19:38:24.124115 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829\": container with ID starting with de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829 not found: ID does not exist" containerID="de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.124153 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829"} err="failed to get container status \"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829\": rpc error: code = NotFound desc = could not find container \"de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829\": container with ID starting with de3030062d9650bb1d8661c0b3de34515c27234586c26fad834bb8e664f56829 not found: ID does not exist" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.124180 4780 scope.go:117] "RemoveContainer" containerID="c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc" Sep 29 19:38:24 crc kubenswrapper[4780]: E0929 19:38:24.124819 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc\": container with ID starting with c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc not found: ID does not exist" containerID="c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.124837 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc"} err="failed to get container status \"c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc\": rpc error: code = NotFound desc = could not find container \"c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc\": container with ID starting with c0254e99b57e8b26cbdfa5d6b838738fe5beeb0102fa19668abe434702fe55bc not found: ID does not exist" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.124850 4780 scope.go:117] "RemoveContainer" containerID="b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7" Sep 29 19:38:24 crc 
kubenswrapper[4780]: E0929 19:38:24.125194 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7\": container with ID starting with b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7 not found: ID does not exist" containerID="b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.125269 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7"} err="failed to get container status \"b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7\": rpc error: code = NotFound desc = could not find container \"b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7\": container with ID starting with b3fef8d6f1fe080f4de223f659ada1ee185903ca53f926bd38996a95287a25a7 not found: ID does not exist" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.350126 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.428815 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities\") pod \"90746ff1-95b0-4874-a644-ceac428c45c9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.428901 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm5tq\" (UniqueName: \"kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq\") pod \"90746ff1-95b0-4874-a644-ceac428c45c9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.428991 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content\") pod \"90746ff1-95b0-4874-a644-ceac428c45c9\" (UID: \"90746ff1-95b0-4874-a644-ceac428c45c9\") " Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.430433 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities" (OuterVolumeSpecName: "utilities") pod "90746ff1-95b0-4874-a644-ceac428c45c9" (UID: "90746ff1-95b0-4874-a644-ceac428c45c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.436498 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq" (OuterVolumeSpecName: "kube-api-access-vm5tq") pod "90746ff1-95b0-4874-a644-ceac428c45c9" (UID: "90746ff1-95b0-4874-a644-ceac428c45c9"). InnerVolumeSpecName "kube-api-access-vm5tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.482865 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90746ff1-95b0-4874-a644-ceac428c45c9" (UID: "90746ff1-95b0-4874-a644-ceac428c45c9"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.531120 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.531153 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm5tq\" (UniqueName: \"kubernetes.io/projected/90746ff1-95b0-4874-a644-ceac428c45c9-kube-api-access-vm5tq\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.531167 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90746ff1-95b0-4874-a644-ceac428c45c9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.988485 4780 generic.go:334] "Generic (PLEG): container finished" podID="90746ff1-95b0-4874-a644-ceac428c45c9" containerID="84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be" exitCode=0 Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.988581 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-flrv9" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.988578 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerDied","Data":"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be"} Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.989146 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flrv9" event={"ID":"90746ff1-95b0-4874-a644-ceac428c45c9","Type":"ContainerDied","Data":"dbee62b1d9a6ce433232dff0265f4dd3957fc6d9f696cd18ce3dfbeeb9bbe265"} Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.989181 4780 scope.go:117] "RemoveContainer" containerID="84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be" Sep 29 19:38:24 crc kubenswrapper[4780]: I0929 19:38:24.991950 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jjhgj" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.027819 4780 scope.go:117] "RemoveContainer" containerID="c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.052377 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.058984 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-flrv9"] Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.065224 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.070468 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jjhgj"] Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.072717 4780 scope.go:117] "RemoveContainer" containerID="0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.109630 4780 scope.go:117] "RemoveContainer" containerID="84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be" Sep 29 19:38:25 crc kubenswrapper[4780]: E0929 19:38:25.110324 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be\": container with ID starting with 84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be not found: ID does not exist" containerID="84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.110391 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be"} err="failed to get container status \"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be\": rpc error: code = NotFound desc = could not find container \"84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be\": container with ID starting with 84cad839f8a4a59006879706798764c659a5e084a6da4e4715d89c0874ebd6be not found: ID does not exist" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.110436 4780 scope.go:117] "RemoveContainer" containerID="c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d" Sep 29 19:38:25 crc kubenswrapper[4780]: E0929 19:38:25.111129 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d\": container with ID starting with c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d not found: ID does not exist" containerID="c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.111197 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d"} err="failed to get container status \"c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d\": rpc error: code = NotFound desc = could not find container \"c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d\": container with ID starting with 
c82c37995b92983599e1daa5a651beb92d1dcff3d8309f5e5629948b816a9a4d not found: ID does not exist" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.111244 4780 scope.go:117] "RemoveContainer" containerID="0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f" Sep 29 19:38:25 crc kubenswrapper[4780]: E0929 19:38:25.111754 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f\": container with ID starting with 0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f not found: ID does not exist" containerID="0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f" Sep 29 19:38:25 crc kubenswrapper[4780]: I0929 19:38:25.111804 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f"} err="failed to get container status \"0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f\": rpc error: code = NotFound desc = could not find container \"0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f\": container with ID starting with 0c53f17ea7b533fdb64997205fb9dc4d4ff586c13ce2db764b0bf6447c61146f not found: ID does not exist" Sep 29 19:38:26 crc kubenswrapper[4780]: I0929 19:38:26.770702 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" path="/var/lib/kubelet/pods/2b4d624b-a733-46fa-b7ea-e3f311882f4b/volumes" Sep 29 19:38:26 crc kubenswrapper[4780]: I0929 19:38:26.771975 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" path="/var/lib/kubelet/pods/90746ff1-95b0-4874-a644-ceac428c45c9/volumes" Sep 29 19:39:03 crc kubenswrapper[4780]: I0929 19:39:03.223368 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:39:03 crc kubenswrapper[4780]: I0929 19:39:03.226148 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:39:33 crc kubenswrapper[4780]: I0929 19:39:33.223289 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:39:33 crc kubenswrapper[4780]: I0929 19:39:33.223928 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:40:03 crc kubenswrapper[4780]: I0929 19:40:03.223769 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:40:03 crc kubenswrapper[4780]: I0929 19:40:03.224704 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:40:03 crc kubenswrapper[4780]: I0929 19:40:03.224897 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:40:03 crc kubenswrapper[4780]: I0929 19:40:03.226710 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:40:03 crc kubenswrapper[4780]: I0929 19:40:03.226841 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312" gracePeriod=600 Sep 29 19:40:04 crc kubenswrapper[4780]: I0929 19:40:04.018655 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312" exitCode=0 Sep 29 19:40:04 crc kubenswrapper[4780]: I0929 19:40:04.018788 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312"} Sep 29 19:40:04 crc kubenswrapper[4780]: I0929 19:40:04.019509 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95"} Sep 29 19:40:04 crc kubenswrapper[4780]: I0929 19:40:04.019545 4780 scope.go:117] "RemoveContainer" containerID="87b69e79c259570f0f284864fc827fc174a1789bce754d70dd403b37a02f1e1e" Sep 29 19:42:03 crc kubenswrapper[4780]: I0929 19:42:03.223375 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:42:03 crc kubenswrapper[4780]: I0929 19:42:03.224245 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:42:33 crc kubenswrapper[4780]: I0929 19:42:33.223791 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:42:33 crc kubenswrapper[4780]: I0929 19:42:33.224700 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.223858 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.224857 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.224933 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.226897 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.227013 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" gracePeriod=600 Sep 29 19:43:03 crc kubenswrapper[4780]: E0929 19:43:03.352404 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.835277 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" exitCode=0 Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.835361 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95"} Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.835414 4780 
scope.go:117] "RemoveContainer" containerID="5c59d71e8033c2ec020a0ce0431260f0e9ec2443ef890afde743461c1490c312" Sep 29 19:43:03 crc kubenswrapper[4780]: I0929 19:43:03.836163 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:43:03 crc kubenswrapper[4780]: E0929 19:43:03.837093 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:43:14 crc kubenswrapper[4780]: I0929 19:43:14.754419 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:43:14 crc kubenswrapper[4780]: E0929 19:43:14.755169 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:43:28 crc kubenswrapper[4780]: I0929 19:43:28.753223 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:43:28 crc kubenswrapper[4780]: E0929 19:43:28.754207 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.593581 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594166 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="extract-content" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594203 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="extract-content" Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594240 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594257 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594285 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="extract-content" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594303 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="extract-content" Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594341 4780 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="extract-utilities" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594358 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="extract-utilities" Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594380 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594395 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: E0929 19:43:29.594447 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="extract-utilities" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594462 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="extract-utilities" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594750 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="90746ff1-95b0-4874-a644-ceac428c45c9" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.594778 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4d624b-a733-46fa-b7ea-e3f311882f4b" containerName="registry-server" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.596965 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.612040 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.691744 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72rg\" (UniqueName: \"kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.692135 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.692345 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.794149 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 
19:43:29.794213 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.794312 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72rg\" (UniqueName: \"kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.795016 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.795139 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.820890 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72rg\" (UniqueName: \"kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg\") pod \"redhat-operators-zzdz6\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:29 crc kubenswrapper[4780]: I0929 19:43:29.935310 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:30 crc kubenswrapper[4780]: I0929 19:43:30.398007 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:31 crc kubenswrapper[4780]: I0929 19:43:31.071203 4780 generic.go:334] "Generic (PLEG): container finished" podID="36926dc1-2054-4816-aa23-5716de4e9b78" containerID="a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8" exitCode=0 Sep 29 19:43:31 crc kubenswrapper[4780]: I0929 19:43:31.071276 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerDied","Data":"a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8"} Sep 29 19:43:31 crc kubenswrapper[4780]: I0929 19:43:31.071567 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerStarted","Data":"63b4b656122af1c50a98015432c9f5c9861bee54c78bfcdd8fc350d0775fff08"} Sep 29 19:43:31 crc kubenswrapper[4780]: I0929 19:43:31.073595 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:43:32 crc kubenswrapper[4780]: I0929 19:43:32.084525 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerStarted","Data":"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506"} Sep 29 19:43:33 crc kubenswrapper[4780]: I0929 19:43:33.111700 4780 generic.go:334] "Generic (PLEG): container finished" podID="36926dc1-2054-4816-aa23-5716de4e9b78" containerID="36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506" exitCode=0 Sep 29 19:43:33 crc kubenswrapper[4780]: I0929 19:43:33.112142 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerDied","Data":"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506"} Sep 29 19:43:34 crc kubenswrapper[4780]: I0929 19:43:34.127203 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerStarted","Data":"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff"} Sep 29 19:43:34 crc kubenswrapper[4780]: I0929 19:43:34.169827 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zzdz6" podStartSLOduration=2.699758472 podStartE2EDuration="5.16979236s" podCreationTimestamp="2025-09-29 19:43:29 +0000 UTC" firstStartedPulling="2025-09-29 19:43:31.073214015 +0000 UTC m=+3611.021512099" lastFinishedPulling="2025-09-29 19:43:33.543247903 +0000 UTC m=+3613.491545987" observedRunningTime="2025-09-29 19:43:34.155775554 +0000 UTC m=+3614.104073638" watchObservedRunningTime="2025-09-29 19:43:34.16979236 +0000 UTC m=+3614.118090444" Sep 29 19:43:39 crc kubenswrapper[4780]: I0929 19:43:39.935999 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:39 crc kubenswrapper[4780]: I0929 19:43:39.936759 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:40 crc 
kubenswrapper[4780]: I0929 19:43:40.007814 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:40 crc kubenswrapper[4780]: I0929 19:43:40.257008 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:40 crc kubenswrapper[4780]: I0929 19:43:40.322601 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:41 crc kubenswrapper[4780]: I0929 19:43:41.753487 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:43:41 crc kubenswrapper[4780]: E0929 19:43:41.753876 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:43:42 crc kubenswrapper[4780]: I0929 19:43:42.208973 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zzdz6" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="registry-server" containerID="cri-o://a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff" gracePeriod=2 Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.820147 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.918592 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content\") pod \"36926dc1-2054-4816-aa23-5716de4e9b78\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.918654 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities\") pod \"36926dc1-2054-4816-aa23-5716de4e9b78\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.918730 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d72rg\" (UniqueName: \"kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg\") pod \"36926dc1-2054-4816-aa23-5716de4e9b78\" (UID: \"36926dc1-2054-4816-aa23-5716de4e9b78\") " Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.919792 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities" (OuterVolumeSpecName: "utilities") pod "36926dc1-2054-4816-aa23-5716de4e9b78" (UID: "36926dc1-2054-4816-aa23-5716de4e9b78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:43:43 crc kubenswrapper[4780]: I0929 19:43:43.930094 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg" (OuterVolumeSpecName: "kube-api-access-d72rg") pod "36926dc1-2054-4816-aa23-5716de4e9b78" (UID: "36926dc1-2054-4816-aa23-5716de4e9b78"). InnerVolumeSpecName "kube-api-access-d72rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.021439 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.021500 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d72rg\" (UniqueName: \"kubernetes.io/projected/36926dc1-2054-4816-aa23-5716de4e9b78-kube-api-access-d72rg\") on node \"crc\" DevicePath \"\"" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.043371 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36926dc1-2054-4816-aa23-5716de4e9b78" (UID: "36926dc1-2054-4816-aa23-5716de4e9b78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.124497 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36926dc1-2054-4816-aa23-5716de4e9b78-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.234654 4780 generic.go:334] "Generic (PLEG): container finished" podID="36926dc1-2054-4816-aa23-5716de4e9b78" containerID="a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff" exitCode=0 Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.234754 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zzdz6" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.234749 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerDied","Data":"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff"} Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.234951 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzdz6" event={"ID":"36926dc1-2054-4816-aa23-5716de4e9b78","Type":"ContainerDied","Data":"63b4b656122af1c50a98015432c9f5c9861bee54c78bfcdd8fc350d0775fff08"} Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.234993 4780 scope.go:117] "RemoveContainer" containerID="a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.281163 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.285949 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zzdz6"] Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.292208 4780 scope.go:117] "RemoveContainer" containerID="36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.321325 4780 scope.go:117] "RemoveContainer" containerID="a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.360956 4780 scope.go:117] "RemoveContainer" containerID="a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff" Sep 29 19:43:44 crc kubenswrapper[4780]: E0929 19:43:44.361809 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff\": container with ID starting with a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff not found: ID does not exist" containerID="a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.361884 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff"} err="failed to get container status \"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff\": rpc error: code = NotFound desc = could not find container \"a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff\": container with ID starting with a5c078a00a3af2953e78a18b32342a9138eb6a736733b43f605b1c68b17366ff not found: ID does not exist" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.361929 4780 scope.go:117] "RemoveContainer" containerID="36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506" Sep 29 19:43:44 crc kubenswrapper[4780]: E0929 19:43:44.362541 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506\": container with ID starting with 36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506 not found: ID does not exist" containerID="36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.362587 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506"} err="failed to get container status \"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506\": rpc error: code = NotFound desc = could not find container \"36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506\": container with ID starting with 36232880050b3268817ed37371d694f43acdcb361c66fb07c4cf9386146c5506 not found: ID does not exist" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.362624 4780 scope.go:117] "RemoveContainer" containerID="a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8" Sep 29 19:43:44 crc kubenswrapper[4780]: E0929 19:43:44.363096 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8\": container with ID starting with a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8 not found: ID does not exist" containerID="a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.363145 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8"} err="failed to get container status \"a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8\": rpc error: code = NotFound desc = could not find container \"a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8\": container with ID starting with a68ed2e5be5e189fc477ab4e4836cb2a6db1010d11cf9b7087501efbef5e63e8 not found: ID does not exist" Sep 29 19:43:44 crc kubenswrapper[4780]: I0929 19:43:44.773111 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" path="/var/lib/kubelet/pods/36926dc1-2054-4816-aa23-5716de4e9b78/volumes" Sep 29 19:43:54 crc kubenswrapper[4780]: I0929 19:43:54.753921 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:43:54 crc kubenswrapper[4780]: E0929 19:43:54.754864 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.700284 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:05 crc kubenswrapper[4780]: E0929 19:44:05.702159 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="registry-server" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.702307 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="registry-server" Sep 29 19:44:05 crc kubenswrapper[4780]: E0929 19:44:05.702345 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="extract-content" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.702814 4780 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="extract-content" Sep 29 19:44:05 crc kubenswrapper[4780]: E0929 19:44:05.702869 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="extract-utilities" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.702884 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="extract-utilities" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.703747 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="36926dc1-2054-4816-aa23-5716de4e9b78" containerName="registry-server" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.709868 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.717590 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.753532 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:44:05 crc kubenswrapper[4780]: E0929 19:44:05.753780 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.803442 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.803514 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6g58\" (UniqueName: \"kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.803549 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.905364 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.905457 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6g58\" 
(UniqueName: \"kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.905511 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.905961 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.906280 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:05 crc kubenswrapper[4780]: I0929 19:44:05.927247 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6g58\" (UniqueName: \"kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58\") pod \"redhat-marketplace-p9285\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:06 crc kubenswrapper[4780]: I0929 19:44:06.050946 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:06 crc kubenswrapper[4780]: I0929 19:44:06.498909 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:07 crc kubenswrapper[4780]: I0929 19:44:07.446006 4780 generic.go:334] "Generic (PLEG): container finished" podID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerID="665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1" exitCode=0 Sep 29 19:44:07 crc kubenswrapper[4780]: I0929 19:44:07.446204 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerDied","Data":"665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1"} Sep 29 19:44:07 crc kubenswrapper[4780]: I0929 19:44:07.446304 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerStarted","Data":"c8330add92d51298812f215c759ea3654e2d74e84ba78b8e1c1c364b6e8e7567"} Sep 29 19:44:09 crc kubenswrapper[4780]: I0929 19:44:09.468006 4780 generic.go:334] "Generic (PLEG): container finished" podID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerID="e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d" exitCode=0 Sep 29 19:44:09 crc kubenswrapper[4780]: I0929 19:44:09.468130 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerDied","Data":"e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d"} Sep 29 19:44:10 crc kubenswrapper[4780]: I0929 19:44:10.481917 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerStarted","Data":"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb"} Sep 29 19:44:10 crc kubenswrapper[4780]: I0929 19:44:10.504686 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p9285" podStartSLOduration=2.959390412 podStartE2EDuration="5.504661134s" podCreationTimestamp="2025-09-29 19:44:05 +0000 UTC" firstStartedPulling="2025-09-29 19:44:07.448253424 +0000 UTC m=+3647.396551478" lastFinishedPulling="2025-09-29 19:44:09.993524116 +0000 UTC m=+3649.941822200" observedRunningTime="2025-09-29 19:44:10.501929997 +0000 UTC m=+3650.450228051" watchObservedRunningTime="2025-09-29 19:44:10.504661134 +0000 UTC m=+3650.452959188" Sep 29 19:44:16 crc kubenswrapper[4780]: I0929 19:44:16.052146 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:16 crc kubenswrapper[4780]: I0929 19:44:16.052521 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:16 crc kubenswrapper[4780]: I0929 19:44:16.112419 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:16 crc kubenswrapper[4780]: I0929 19:44:16.633340 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:16 crc kubenswrapper[4780]: I0929 19:44:16.704458 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:18 crc kubenswrapper[4780]: I0929 19:44:18.575000 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p9285" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="registry-server" containerID="cri-o://0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb" gracePeriod=2 Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.035166 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.118879 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content\") pod \"70f33a86-f77f-46ca-9946-24cbe569b1ee\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.118960 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities\") pod \"70f33a86-f77f-46ca-9946-24cbe569b1ee\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.119021 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6g58\" (UniqueName: \"kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58\") pod \"70f33a86-f77f-46ca-9946-24cbe569b1ee\" (UID: \"70f33a86-f77f-46ca-9946-24cbe569b1ee\") " Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.120225 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities" (OuterVolumeSpecName: "utilities") pod "70f33a86-f77f-46ca-9946-24cbe569b1ee" (UID: "70f33a86-f77f-46ca-9946-24cbe569b1ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.124145 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58" (OuterVolumeSpecName: "kube-api-access-j6g58") pod "70f33a86-f77f-46ca-9946-24cbe569b1ee" (UID: "70f33a86-f77f-46ca-9946-24cbe569b1ee"). InnerVolumeSpecName "kube-api-access-j6g58". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.139261 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70f33a86-f77f-46ca-9946-24cbe569b1ee" (UID: "70f33a86-f77f-46ca-9946-24cbe569b1ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.221022 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.221080 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70f33a86-f77f-46ca-9946-24cbe569b1ee-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.221094 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6g58\" (UniqueName: \"kubernetes.io/projected/70f33a86-f77f-46ca-9946-24cbe569b1ee-kube-api-access-j6g58\") on node \"crc\" DevicePath \"\"" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.588359 4780 generic.go:334] "Generic (PLEG): container finished" podID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerID="0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb" exitCode=0 Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.588424 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerDied","Data":"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb"} Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.588462 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9285" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.588490 4780 scope.go:117] "RemoveContainer" containerID="0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.588469 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9285" event={"ID":"70f33a86-f77f-46ca-9946-24cbe569b1ee","Type":"ContainerDied","Data":"c8330add92d51298812f215c759ea3654e2d74e84ba78b8e1c1c364b6e8e7567"} Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.617410 4780 scope.go:117] "RemoveContainer" containerID="e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.648031 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.658659 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9285"] Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.680174 4780 scope.go:117] "RemoveContainer" containerID="665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.708556 4780 scope.go:117] "RemoveContainer" containerID="0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb" Sep 29 19:44:19 crc kubenswrapper[4780]: E0929 19:44:19.709084 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb\": container with ID starting with 0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb not found: ID does not exist" containerID="0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.709126 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb"} err="failed to get container status \"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb\": rpc error: code = NotFound desc = could not find container \"0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb\": container with ID starting with 0800e806fe95901794c513ac7bb05791905e9f5f7d19a9c647fbc6df547904fb not found: ID does not exist" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.709152 4780 scope.go:117] "RemoveContainer" containerID="e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d" Sep 29 19:44:19 crc kubenswrapper[4780]: E0929 19:44:19.709557 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d\": container with ID starting with e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d not found: ID does not exist" containerID="e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.709613 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d"} err="failed to get container status \"e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d\": rpc error: code = NotFound desc = could not find container \"e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d\": container with ID starting with e14be26209f46c133e7701cf4da824355b5417d3b3fb48e30320a21f314b757d not found: ID does not exist" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.709661 4780 scope.go:117] "RemoveContainer" containerID="665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1" Sep 29 19:44:19 crc kubenswrapper[4780]: E0929 19:44:19.710118 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1\": container with ID starting with 665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1 not found: ID does not exist" containerID="665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1" Sep 29 19:44:19 crc kubenswrapper[4780]: I0929 19:44:19.710149 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1"} err="failed to get container status \"665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1\": rpc error: code = NotFound desc = could not find container \"665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1\": container with ID starting with 665b625fcb97ba4296e1c8b34c606b84757c007510e5bb08868860f75c940de1 not found: ID does not exist" Sep 29 19:44:20 crc kubenswrapper[4780]: I0929 19:44:20.760296 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:44:20 crc kubenswrapper[4780]: E0929 19:44:20.760947 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:44:20 crc kubenswrapper[4780]: I0929 19:44:20.782925 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" path="/var/lib/kubelet/pods/70f33a86-f77f-46ca-9946-24cbe569b1ee/volumes" Sep 29 19:44:33 crc kubenswrapper[4780]: I0929 19:44:33.753296 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:44:33 crc kubenswrapper[4780]: E0929 19:44:33.754094 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:44:46 crc kubenswrapper[4780]: I0929 19:44:46.754213 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:44:46 crc kubenswrapper[4780]: E0929 19:44:46.755372 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:44:59 crc kubenswrapper[4780]: I0929 19:44:59.753840 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:44:59 crc kubenswrapper[4780]: E0929 19:44:59.755183 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.217957 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws"] Sep 29 19:45:00 crc kubenswrapper[4780]: E0929 19:45:00.218559 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="extract-content" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.218597 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="extract-content" Sep 29 19:45:00 crc kubenswrapper[4780]: E0929 19:45:00.218630 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="registry-server" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.218647 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="registry-server" Sep 29 19:45:00 crc kubenswrapper[4780]: E0929 19:45:00.218682 4780 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="extract-utilities" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.218700 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="extract-utilities" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.219090 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f33a86-f77f-46ca-9946-24cbe569b1ee" containerName="registry-server" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.220083 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.223611 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.224725 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.243479 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws"] Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.316811 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.317119 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prk4q\" (UniqueName: \"kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.317288 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.418628 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prk4q\" (UniqueName: \"kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.418724 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc 
kubenswrapper[4780]: I0929 19:45:00.418858 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.420366 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.427146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.447977 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prk4q\" (UniqueName: \"kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q\") pod \"collect-profiles-29319585-pqsws\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:00 crc kubenswrapper[4780]: I0929 19:45:00.549883 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:01 crc kubenswrapper[4780]: I0929 19:45:01.078934 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws"] Sep 29 19:45:02 crc kubenswrapper[4780]: I0929 19:45:02.039412 4780 generic.go:334] "Generic (PLEG): container finished" podID="2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" containerID="02d3a8146715bdc11224313757996f2cfc10dec04b9fac8c16b441d4d3968fed" exitCode=0 Sep 29 19:45:02 crc kubenswrapper[4780]: I0929 19:45:02.039506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" event={"ID":"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf","Type":"ContainerDied","Data":"02d3a8146715bdc11224313757996f2cfc10dec04b9fac8c16b441d4d3968fed"} Sep 29 19:45:02 crc kubenswrapper[4780]: I0929 19:45:02.039836 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" event={"ID":"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf","Type":"ContainerStarted","Data":"ed0a649cfc630847b0d23267c9c0a530ec937bfbdb54768c0b700c455ae3cab4"} Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.341925 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.466718 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume\") pod \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.466808 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prk4q\" (UniqueName: \"kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q\") pod \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.466864 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume\") pod \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\" (UID: \"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf\") " Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.469126 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume" (OuterVolumeSpecName: "config-volume") pod "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" (UID: "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.475744 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q" (OuterVolumeSpecName: "kube-api-access-prk4q") pod "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" (UID: "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf"). InnerVolumeSpecName "kube-api-access-prk4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.476597 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" (UID: "2e468c1d-caa0-4d9b-a3ec-4f955a636aaf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.568667 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.568719 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 19:45:03 crc kubenswrapper[4780]: I0929 19:45:03.568740 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prk4q\" (UniqueName: \"kubernetes.io/projected/2e468c1d-caa0-4d9b-a3ec-4f955a636aaf-kube-api-access-prk4q\") on node \"crc\" DevicePath \"\"" Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.062096 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" event={"ID":"2e468c1d-caa0-4d9b-a3ec-4f955a636aaf","Type":"ContainerDied","Data":"ed0a649cfc630847b0d23267c9c0a530ec937bfbdb54768c0b700c455ae3cab4"} Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.062144 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed0a649cfc630847b0d23267c9c0a530ec937bfbdb54768c0b700c455ae3cab4" Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.062158 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319585-pqsws" Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.423371 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw"] Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.427898 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319540-kj8xw"] Sep 29 19:45:04 crc kubenswrapper[4780]: I0929 19:45:04.769889 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc73004-9d27-4d7e-8082-95d33f7b0fc7" path="/var/lib/kubelet/pods/4fc73004-9d27-4d7e-8082-95d33f7b0fc7/volumes" Sep 29 19:45:13 crc kubenswrapper[4780]: I0929 19:45:13.753592 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:45:13 crc kubenswrapper[4780]: E0929 19:45:13.754588 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:45:26 crc kubenswrapper[4780]: I0929 19:45:26.653902 4780 scope.go:117] "RemoveContainer" containerID="edb2d720a111d6be8d4f3a2b4f81db83411c60a97b324de83c9883841ab42430" Sep 29 19:45:28 crc kubenswrapper[4780]: I0929 19:45:28.754171 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:45:28 crc kubenswrapper[4780]: E0929 19:45:28.754917 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:45:42 crc kubenswrapper[4780]: I0929 19:45:42.753850 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:45:42 crc kubenswrapper[4780]: E0929 19:45:42.757888 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:45:57 crc kubenswrapper[4780]: I0929 19:45:57.752656 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:45:57 crc kubenswrapper[4780]: E0929 19:45:57.754463 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:46:10 crc kubenswrapper[4780]: I0929 19:46:10.760324 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:46:10 crc kubenswrapper[4780]: E0929 19:46:10.761531 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:46:24 crc kubenswrapper[4780]: I0929 19:46:24.753755 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:46:24 crc kubenswrapper[4780]: E0929 19:46:24.754919 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:46:39 crc kubenswrapper[4780]: I0929 19:46:39.753878 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:46:39 crc kubenswrapper[4780]: E0929 19:46:39.758169 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:46:54 crc kubenswrapper[4780]: I0929 19:46:54.753140 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:46:54 crc kubenswrapper[4780]: E0929 19:46:54.753901 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:47:09 crc kubenswrapper[4780]: I0929 19:47:09.752933 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:47:09 crc kubenswrapper[4780]: E0929 19:47:09.754178 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:47:20 crc kubenswrapper[4780]: I0929 19:47:20.765666 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:47:20 crc kubenswrapper[4780]: E0929 19:47:20.767033 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:47:31 crc kubenswrapper[4780]: I0929 19:47:31.754325 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:47:31 crc kubenswrapper[4780]: E0929 19:47:31.755200 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:47:44 crc kubenswrapper[4780]: I0929 19:47:44.754771 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:47:44 crc kubenswrapper[4780]: E0929 19:47:44.756016 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:47:57 crc kubenswrapper[4780]: I0929 19:47:57.753654 4780 
scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:47:57 crc kubenswrapper[4780]: E0929 19:47:57.754868 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:48:09 crc kubenswrapper[4780]: I0929 19:48:09.754165 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:48:10 crc kubenswrapper[4780]: I0929 19:48:10.941833 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c"} Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.600486 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:22 crc kubenswrapper[4780]: E0929 19:48:22.601576 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" containerName="collect-profiles" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.601601 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" containerName="collect-profiles" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.601827 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e468c1d-caa0-4d9b-a3ec-4f955a636aaf" containerName="collect-profiles" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.603618 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.617742 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.654649 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.654725 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r58j\" (UniqueName: \"kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.654883 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.756525 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.756582 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r58j\" (UniqueName: \"kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.757188 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.757133 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.757934 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.782749 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5r58j\" (UniqueName: \"kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j\") pod \"certified-operators-2vkm5\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:22 crc kubenswrapper[4780]: I0929 19:48:22.932250 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:23 crc kubenswrapper[4780]: I0929 19:48:23.554042 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:24 crc kubenswrapper[4780]: I0929 19:48:24.064180 4780 generic.go:334] "Generic (PLEG): container finished" podID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerID="5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1" exitCode=0 Sep 29 19:48:24 crc kubenswrapper[4780]: I0929 19:48:24.064290 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerDied","Data":"5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1"} Sep 29 19:48:24 crc kubenswrapper[4780]: I0929 19:48:24.065956 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerStarted","Data":"93fcbcf209dd7f6d132b633fcd261eaa559f0fb4564e167b28d7512b5b0b1940"} Sep 29 19:48:25 crc kubenswrapper[4780]: I0929 19:48:25.079849 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerStarted","Data":"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f"} Sep 29 19:48:26 crc kubenswrapper[4780]: I0929 19:48:26.091251 4780 generic.go:334] "Generic (PLEG): container finished" podID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerID="f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f" exitCode=0 Sep 29 19:48:26 crc kubenswrapper[4780]: I0929 19:48:26.091317 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerDied","Data":"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f"} Sep 29 19:48:27 crc kubenswrapper[4780]: I0929 19:48:27.113414 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerStarted","Data":"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55"} Sep 29 19:48:27 crc kubenswrapper[4780]: I0929 19:48:27.148353 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2vkm5" podStartSLOduration=2.489729165 podStartE2EDuration="5.148335276s" podCreationTimestamp="2025-09-29 19:48:22 +0000 UTC" firstStartedPulling="2025-09-29 19:48:24.066469662 +0000 UTC m=+3904.014767746" lastFinishedPulling="2025-09-29 19:48:26.725075743 +0000 UTC m=+3906.673373857" observedRunningTime="2025-09-29 19:48:27.145089654 +0000 UTC m=+3907.093387738" watchObservedRunningTime="2025-09-29 19:48:27.148335276 +0000 UTC m=+3907.096633330" Sep 29 19:48:32 crc kubenswrapper[4780]: I0929 19:48:32.932720 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:32 crc kubenswrapper[4780]: I0929 19:48:32.933422 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:33 crc kubenswrapper[4780]: I0929 19:48:33.010233 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:33 crc kubenswrapper[4780]: I0929 19:48:33.214576 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:33 crc kubenswrapper[4780]: I0929 19:48:33.282641 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.187980 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2vkm5" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="registry-server" containerID="cri-o://5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55" gracePeriod=2 Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.684449 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.877276 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content\") pod \"ec7aa340-e8e1-48ad-973d-1c951baa3906\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.877358 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities\") pod \"ec7aa340-e8e1-48ad-973d-1c951baa3906\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.877475 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r58j\" (UniqueName: \"kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j\") pod \"ec7aa340-e8e1-48ad-973d-1c951baa3906\" (UID: \"ec7aa340-e8e1-48ad-973d-1c951baa3906\") " Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.878435 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities" (OuterVolumeSpecName: "utilities") pod "ec7aa340-e8e1-48ad-973d-1c951baa3906" (UID: "ec7aa340-e8e1-48ad-973d-1c951baa3906"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.882381 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j" (OuterVolumeSpecName: "kube-api-access-5r58j") pod "ec7aa340-e8e1-48ad-973d-1c951baa3906" (UID: "ec7aa340-e8e1-48ad-973d-1c951baa3906"). InnerVolumeSpecName "kube-api-access-5r58j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.917226 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec7aa340-e8e1-48ad-973d-1c951baa3906" (UID: "ec7aa340-e8e1-48ad-973d-1c951baa3906"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.978556 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r58j\" (UniqueName: \"kubernetes.io/projected/ec7aa340-e8e1-48ad-973d-1c951baa3906-kube-api-access-5r58j\") on node \"crc\" DevicePath \"\"" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.978584 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:48:35 crc kubenswrapper[4780]: I0929 19:48:35.978594 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7aa340-e8e1-48ad-973d-1c951baa3906-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.205633 4780 generic.go:334] "Generic (PLEG): container finished" podID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerID="5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55" exitCode=0 Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.205713 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerDied","Data":"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55"} Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.205760 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vkm5" event={"ID":"ec7aa340-e8e1-48ad-973d-1c951baa3906","Type":"ContainerDied","Data":"93fcbcf209dd7f6d132b633fcd261eaa559f0fb4564e167b28d7512b5b0b1940"} Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.205794 4780 scope.go:117] "RemoveContainer" containerID="5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.206019 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2vkm5" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.238870 4780 scope.go:117] "RemoveContainer" containerID="f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.256812 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.263741 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2vkm5"] Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.279530 4780 scope.go:117] "RemoveContainer" containerID="5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.306630 4780 scope.go:117] "RemoveContainer" containerID="5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55" Sep 29 19:48:36 crc kubenswrapper[4780]: E0929 19:48:36.307163 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55\": container with ID starting with 5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55 not found: ID does not exist" containerID="5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.307227 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55"} err="failed to get container status \"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55\": rpc error: code = NotFound desc = could not find container \"5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55\": container with ID starting with 5f13884091f5e2ad30b20385b8ad4dbbdb21cd9e414820ecdcc9c802be3f3f55 not found: ID does not exist" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.307266 4780 scope.go:117] "RemoveContainer" containerID="f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f" Sep 29 19:48:36 crc kubenswrapper[4780]: E0929 19:48:36.307803 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f\": container with ID starting with f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f not found: ID does not exist" containerID="f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.307852 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f"} err="failed to get container status \"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f\": rpc error: code = NotFound desc = could not find container \"f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f\": container with ID starting with f9e80ea5a9cdbc77d89d1a15443a4ec2e4f4834104e3d60c873ce37f0d4a2f2f not found: ID does not exist" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.307881 4780 scope.go:117] "RemoveContainer" containerID="5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1" Sep 29 19:48:36 crc kubenswrapper[4780]: E0929 19:48:36.308380 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1\": container with ID starting with 5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1 not found: ID does not exist" containerID="5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.308458 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1"} err="failed to get container status \"5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1\": rpc error: code = NotFound desc = could not find container \"5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1\": container with ID starting with 5ec857d7312470d5588b3751bdccc9f95beed055eaa6b4cebb3c48e1f44cabc1 not found: ID does not exist" Sep 29 19:48:36 crc kubenswrapper[4780]: I0929 19:48:36.765909 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" path="/var/lib/kubelet/pods/ec7aa340-e8e1-48ad-973d-1c951baa3906/volumes" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.019985 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:48:56 crc kubenswrapper[4780]: E0929 19:48:56.021469 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="extract-content" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.021496 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="extract-content" Sep 29 19:48:56 crc kubenswrapper[4780]: E0929 19:48:56.021539 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="extract-utilities" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.021552 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="extract-utilities" Sep 29 19:48:56 crc kubenswrapper[4780]: E0929 19:48:56.021573 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="registry-server" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.021590 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="registry-server" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.021855 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7aa340-e8e1-48ad-973d-1c951baa3906" containerName="registry-server" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.024923 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.047042 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.209875 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cbv\" (UniqueName: \"kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.209995 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.210077 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.311551 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.311621 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.311713 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cbv\" (UniqueName: \"kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.312560 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.312704 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.337818 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-44cbv\" (UniqueName: \"kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv\") pod \"community-operators-f67r9\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.354633 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:48:56 crc kubenswrapper[4780]: I0929 19:48:56.883713 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:48:57 crc kubenswrapper[4780]: I0929 19:48:57.407896 4780 generic.go:334] "Generic (PLEG): container finished" podID="0e849500-21aa-49cc-a07a-c5a80db65620" containerID="822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c" exitCode=0 Sep 29 19:48:57 crc kubenswrapper[4780]: I0929 19:48:57.407967 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerDied","Data":"822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c"} Sep 29 19:48:57 crc kubenswrapper[4780]: I0929 19:48:57.408275 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerStarted","Data":"9ac16fe568cf5b45d860863244dae4ff470e4048b91fd009efae75bb8d2038d4"} Sep 29 19:48:57 crc kubenswrapper[4780]: I0929 19:48:57.410675 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:48:58 crc kubenswrapper[4780]: I0929 19:48:58.421082 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerStarted","Data":"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8"} Sep 29 19:48:59 crc kubenswrapper[4780]: I0929 19:48:59.433359 4780 generic.go:334] "Generic (PLEG): container finished" podID="0e849500-21aa-49cc-a07a-c5a80db65620" containerID="99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8" exitCode=0 Sep 29 19:48:59 crc kubenswrapper[4780]: I0929 19:48:59.433461 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerDied","Data":"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8"} Sep 29 19:49:00 crc kubenswrapper[4780]: I0929 19:49:00.446918 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerStarted","Data":"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312"} Sep 29 19:49:00 crc kubenswrapper[4780]: I0929 19:49:00.483350 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f67r9" podStartSLOduration=2.9336713960000003 podStartE2EDuration="5.483329844s" podCreationTimestamp="2025-09-29 19:48:55 +0000 UTC" firstStartedPulling="2025-09-29 19:48:57.410314822 +0000 UTC m=+3937.358612896" lastFinishedPulling="2025-09-29 19:48:59.9599733 +0000 UTC m=+3939.908271344" observedRunningTime="2025-09-29 19:49:00.480601427 +0000 UTC m=+3940.428899501" watchObservedRunningTime="2025-09-29 
19:49:00.483329844 +0000 UTC m=+3940.431627888" Sep 29 19:49:06 crc kubenswrapper[4780]: I0929 19:49:06.355687 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:06 crc kubenswrapper[4780]: I0929 19:49:06.358016 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:06 crc kubenswrapper[4780]: I0929 19:49:06.421695 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:07 crc kubenswrapper[4780]: I0929 19:49:07.194853 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:09 crc kubenswrapper[4780]: I0929 19:49:09.603035 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:49:09 crc kubenswrapper[4780]: I0929 19:49:09.603572 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f67r9" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="registry-server" containerID="cri-o://7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312" gracePeriod=2 Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.041083 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.142906 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities\") pod \"0e849500-21aa-49cc-a07a-c5a80db65620\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.142977 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content\") pod \"0e849500-21aa-49cc-a07a-c5a80db65620\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.143036 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44cbv\" (UniqueName: \"kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv\") pod \"0e849500-21aa-49cc-a07a-c5a80db65620\" (UID: \"0e849500-21aa-49cc-a07a-c5a80db65620\") " Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.144285 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities" (OuterVolumeSpecName: "utilities") pod "0e849500-21aa-49cc-a07a-c5a80db65620" (UID: "0e849500-21aa-49cc-a07a-c5a80db65620"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.148126 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv" (OuterVolumeSpecName: "kube-api-access-44cbv") pod "0e849500-21aa-49cc-a07a-c5a80db65620" (UID: "0e849500-21aa-49cc-a07a-c5a80db65620"). InnerVolumeSpecName "kube-api-access-44cbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.193871 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e849500-21aa-49cc-a07a-c5a80db65620" (UID: "0e849500-21aa-49cc-a07a-c5a80db65620"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.245210 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44cbv\" (UniqueName: \"kubernetes.io/projected/0e849500-21aa-49cc-a07a-c5a80db65620-kube-api-access-44cbv\") on node \"crc\" DevicePath \"\"" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.245236 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.245246 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e849500-21aa-49cc-a07a-c5a80db65620-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.554846 4780 generic.go:334] "Generic (PLEG): container finished" podID="0e849500-21aa-49cc-a07a-c5a80db65620" containerID="7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312" exitCode=0 Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.554910 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerDied","Data":"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312"} Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.554963 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f67r9" event={"ID":"0e849500-21aa-49cc-a07a-c5a80db65620","Type":"ContainerDied","Data":"9ac16fe568cf5b45d860863244dae4ff470e4048b91fd009efae75bb8d2038d4"} Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.554998 4780 scope.go:117] "RemoveContainer" containerID="7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.554996 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f67r9" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.587247 4780 scope.go:117] "RemoveContainer" containerID="99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.610913 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.621291 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f67r9"] Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.631017 4780 scope.go:117] "RemoveContainer" containerID="822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.670872 4780 scope.go:117] "RemoveContainer" containerID="7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312" Sep 29 19:49:10 crc kubenswrapper[4780]: E0929 19:49:10.671490 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312\": container with ID starting with 7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312 not found: ID does not exist" containerID="7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.671539 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312"} err="failed to get container status \"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312\": rpc error: code = NotFound desc = could not find container \"7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312\": container with ID starting with 7dff119085615fa457bd99e341ddcb47fe817870eae661696ec89fc20c462312 not found: ID does not exist" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.671573 4780 scope.go:117] "RemoveContainer" containerID="99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8" Sep 29 19:49:10 crc kubenswrapper[4780]: E0929 19:49:10.672350 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8\": container with ID starting with 99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8 not found: ID does not exist" containerID="99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.672512 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8"} err="failed to get container status \"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8\": rpc error: code = NotFound desc = could not find container \"99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8\": container with ID starting with 99be79fa7df90e12205481799dec4c2e0d69173e86082262e2480bd4ff21d4a8 not found: ID does not exist" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.672599 4780 scope.go:117] "RemoveContainer" containerID="822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c" Sep 29 19:49:10 crc kubenswrapper[4780]: E0929 19:49:10.673362 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c\": container with ID starting with 822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c not found: ID does not exist" containerID="822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.673409 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c"} err="failed to get container status \"822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c\": rpc error: code = NotFound desc = could not find container \"822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c\": container with ID starting with 822e739b69a1fe9e76cc33b66b421ce935871451c74feb9772761d7214ddc57c not found: ID does not exist" Sep 29 19:49:10 crc kubenswrapper[4780]: I0929 19:49:10.769250 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" path="/var/lib/kubelet/pods/0e849500-21aa-49cc-a07a-c5a80db65620/volumes" Sep 29 19:50:33 crc kubenswrapper[4780]: I0929 19:50:33.223185 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:50:33 crc kubenswrapper[4780]: I0929 19:50:33.223764 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:51:03 crc kubenswrapper[4780]: I0929 19:51:03.223688 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:51:03 crc kubenswrapper[4780]: I0929 19:51:03.224628 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.223675 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.224515 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.224557 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.225116 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.225186 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c" gracePeriod=600 Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.917662 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c" exitCode=0 Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.917724 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c"} Sep 29 19:51:33 crc kubenswrapper[4780]: I0929 19:51:33.918024 4780 scope.go:117] "RemoveContainer" containerID="9f8af67243db53af5986f58307f7d887a8f0f86d0cef0eb2a0099665f9357a95" Sep 29 19:51:34 crc kubenswrapper[4780]: I0929 19:51:34.928903 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb"} Sep 29 19:54:03 crc kubenswrapper[4780]: I0929 19:54:03.223361 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:54:03 crc kubenswrapper[4780]: I0929 19:54:03.224322 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.926699 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:04 crc kubenswrapper[4780]: E0929 19:54:04.927183 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="extract-utilities" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.927202 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="extract-utilities" Sep 29 19:54:04 crc kubenswrapper[4780]: E0929 19:54:04.927233 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="registry-server" Sep 29 
19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.927244 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="registry-server" Sep 29 19:54:04 crc kubenswrapper[4780]: E0929 19:54:04.927272 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="extract-content" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.927283 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="extract-content" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.927513 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e849500-21aa-49cc-a07a-c5a80db65620" containerName="registry-server" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.929156 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.940964 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.949588 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.949685 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf9cq\" (UniqueName: \"kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:04 crc kubenswrapper[4780]: I0929 19:54:04.949735 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.050840 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.050925 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf9cq\" (UniqueName: \"kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.050956 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 
19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.051481 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.051483 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.074809 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf9cq\" (UniqueName: \"kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq\") pod \"redhat-operators-czccr\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") " pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.266515 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:05 crc kubenswrapper[4780]: I0929 19:54:05.775112 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:06 crc kubenswrapper[4780]: I0929 19:54:06.401102 4780 generic.go:334] "Generic (PLEG): container finished" podID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerID="2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446" exitCode=0 Sep 29 19:54:06 crc kubenswrapper[4780]: I0929 19:54:06.401159 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerDied","Data":"2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446"} Sep 29 19:54:06 crc kubenswrapper[4780]: I0929 19:54:06.401191 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerStarted","Data":"c6e7a4a820b1e95e3f74ad521ed9e3c4c87cfc4d4cbdc35aa40d7e6f44284b11"} Sep 29 19:54:06 crc kubenswrapper[4780]: I0929 19:54:06.405365 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 19:54:07 crc kubenswrapper[4780]: I0929 19:54:07.412753 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerStarted","Data":"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c"} Sep 29 19:54:08 crc kubenswrapper[4780]: I0929 19:54:08.424125 4780 generic.go:334] "Generic (PLEG): container finished" podID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerID="fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c" exitCode=0 Sep 29 19:54:08 crc kubenswrapper[4780]: I0929 19:54:08.424223 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerDied","Data":"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c"} Sep 29 19:54:09 crc kubenswrapper[4780]: I0929 19:54:09.438131 4780 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerStarted","Data":"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea"} Sep 29 19:54:09 crc kubenswrapper[4780]: I0929 19:54:09.472648 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-czccr" podStartSLOduration=3.049443166 podStartE2EDuration="5.472616193s" podCreationTimestamp="2025-09-29 19:54:04 +0000 UTC" firstStartedPulling="2025-09-29 19:54:06.40450683 +0000 UTC m=+4246.352804894" lastFinishedPulling="2025-09-29 19:54:08.827679807 +0000 UTC m=+4248.775977921" observedRunningTime="2025-09-29 19:54:09.465285255 +0000 UTC m=+4249.413583319" watchObservedRunningTime="2025-09-29 19:54:09.472616193 +0000 UTC m=+4249.420914277" Sep 29 19:54:15 crc kubenswrapper[4780]: I0929 19:54:15.267457 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:15 crc kubenswrapper[4780]: I0929 19:54:15.268102 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:15 crc kubenswrapper[4780]: I0929 19:54:15.342890 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:15 crc kubenswrapper[4780]: I0929 19:54:15.565543 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:15 crc kubenswrapper[4780]: I0929 19:54:15.630768 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:17 crc kubenswrapper[4780]: I0929 19:54:17.508392 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-czccr" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="registry-server" containerID="cri-o://c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea" gracePeriod=2 Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.003825 4780 util.go:48] "No ready sandbox for pod can be found. 
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.003825 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czccr"
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.159089 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content\") pod \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") "
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.159279 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities\") pod \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") "
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.159326 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf9cq\" (UniqueName: \"kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq\") pod \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\" (UID: \"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83\") "
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.161136 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities" (OuterVolumeSpecName: "utilities") pod "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" (UID: "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.174460 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq" (OuterVolumeSpecName: "kube-api-access-lf9cq") pod "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" (UID: "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83"). InnerVolumeSpecName "kube-api-access-lf9cq".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.261935 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.261996 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf9cq\" (UniqueName: \"kubernetes.io/projected/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-kube-api-access-lf9cq\") on node \"crc\" DevicePath \"\"" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.520671 4780 generic.go:334] "Generic (PLEG): container finished" podID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerID="c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea" exitCode=0 Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.520736 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerDied","Data":"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea"} Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.520777 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czccr" event={"ID":"d81ddbfb-6b84-4cf2-b610-4fe1607b1f83","Type":"ContainerDied","Data":"c6e7a4a820b1e95e3f74ad521ed9e3c4c87cfc4d4cbdc35aa40d7e6f44284b11"} Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.520809 4780 scope.go:117] "RemoveContainer" containerID="c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.520989 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-czccr" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.544214 4780 scope.go:117] "RemoveContainer" containerID="fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.563380 4780 scope.go:117] "RemoveContainer" containerID="2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.589140 4780 scope.go:117] "RemoveContainer" containerID="c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea" Sep 29 19:54:18 crc kubenswrapper[4780]: E0929 19:54:18.589739 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea\": container with ID starting with c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea not found: ID does not exist" containerID="c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.589789 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea"} err="failed to get container status \"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea\": rpc error: code = NotFound desc = could not find container \"c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea\": container with ID starting with c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea not found: ID does not exist" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.589824 4780 scope.go:117] "RemoveContainer" containerID="fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c" Sep 29 19:54:18 crc kubenswrapper[4780]: E0929 19:54:18.590431 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c\": container with ID starting with fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c not found: ID does not exist" containerID="fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.590472 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c"} err="failed to get container status \"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c\": rpc error: code = NotFound desc = could not find container \"fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c\": container with ID starting with fb520b987b4511b4d7f88e8887cc8ac70065c3951a805cdbd31fa50cf378493c not found: ID does not exist" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.590500 4780 scope.go:117] "RemoveContainer" containerID="2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446" Sep 29 19:54:18 crc kubenswrapper[4780]: E0929 19:54:18.590820 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446\": container with ID starting with 2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446 not found: ID does not exist" containerID="2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446" 
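The RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" triplets above show the kubelet tolerating containers that are already gone: once CRI-O answers a status query with gRPC NotFound, cleanup is treated as complete rather than as a failure. A minimal sketch of the same idempotent check against the CRI socket; the socket path is the usual CRI-O default and may differ per host, and this is illustrative, not the kubelet's own code:

```go
package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Connect to CRI-O over its unix socket (default path assumed).
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	id := "c2cab476bed17a3c2a934f5b0d903504d3c8e5010a66b2811176b087ddcf5cea"

	_, err = rt.ContainerStatus(context.Background(),
		&runtimeapi.ContainerStatusRequest{ContainerId: id})
	if status.Code(err) == codes.NotFound {
		// Same condition as the log above: already removed, nothing to do.
		fmt.Println("container already removed; treating delete as done")
	} else if err != nil {
		panic(err)
	}
}
```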
Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.590877 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446"} err="failed to get container status \"2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446\": rpc error: code = NotFound desc = could not find container \"2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446\": container with ID starting with 2f47de613dce7b14127956c38b557a112df5ca2cef07b37b5ec27386412f3446 not found: ID does not exist" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.708260 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" (UID: "d81ddbfb-6b84-4cf2-b610-4fe1607b1f83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.769792 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.858528 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:18 crc kubenswrapper[4780]: I0929 19:54:18.868104 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-czccr"] Sep 29 19:54:20 crc kubenswrapper[4780]: I0929 19:54:20.772849 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" path="/var/lib/kubelet/pods/d81ddbfb-6b84-4cf2-b610-4fe1607b1f83/volumes" Sep 29 19:54:33 crc kubenswrapper[4780]: I0929 19:54:33.223315 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:54:33 crc kubenswrapper[4780]: I0929 19:54:33.224133 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.223478 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.224427 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.224486 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.225464 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.225629 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" gracePeriod=600 Sep 29 19:55:03 crc kubenswrapper[4780]: E0929 19:55:03.355394 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.965335 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" exitCode=0 Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.965454 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb"} Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.965728 4780 scope.go:117] "RemoveContainer" containerID="eb5b4c1b5bd8e2acb286e71cc95172b4e26e921668ff5e3ae3857289dd03f33c" Sep 29 19:55:03 crc kubenswrapper[4780]: I0929 19:55:03.966341 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:55:03 crc kubenswrapper[4780]: E0929 19:55:03.966721 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:55:14 crc kubenswrapper[4780]: I0929 19:55:14.753842 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:55:14 crc kubenswrapper[4780]: E0929 19:55:14.754631 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.458628 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"]
Sep 29 19:55:24 crc kubenswrapper[4780]: E0929 19:55:24.459909 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="registry-server"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.459935 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="registry-server"
Sep 29 19:55:24 crc kubenswrapper[4780]: E0929 19:55:24.459973 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="extract-utilities"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.459988 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="extract-utilities"
Sep 29 19:55:24 crc kubenswrapper[4780]: E0929 19:55:24.460027 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="extract-content"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.460041 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="extract-content"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.460443 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81ddbfb-6b84-4cf2-b610-4fe1607b1f83" containerName="registry-server"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.471813 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.475520 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"]
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.531593 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l675\" (UniqueName: \"kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.531714 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.531777 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.632560 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.632686 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.632779 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l675\" (UniqueName: \"kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.633202 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.634085 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.683200 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l675\" (UniqueName: \"kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675\") pod \"redhat-marketplace-dpx9z\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " pod="openshift-marketplace/redhat-marketplace-dpx9z"
Sep 29 19:55:24 crc kubenswrapper[4780]: I0929 19:55:24.805447 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpx9z"
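The volumes the reconciler sets up above are two pod-local emptyDirs (catalog-content, utilities) plus the kube-api-access-4l675 projected service-account token, which Kubernetes injects automatically rather than being declared in the pod spec. A minimal sketch of the two declared volumes; this is illustrative, not the actual openshift-marketplace manifest:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Ephemeral scratch space, as in the mounts logged above; both
	// disappear with the pod, which is why they are unmounted and
	// "detached" during teardown rather than reattached anywhere.
	volumes := []corev1.Volume{
		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{},
		}},
		{Name: "utilities", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{},
		}},
	}
	for _, v := range volumes {
		fmt.Println(v.Name)
	}
}
```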
Sep 29 19:55:25 crc kubenswrapper[4780]: I0929 19:55:25.059315 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"]
Sep 29 19:55:25 crc kubenswrapper[4780]: W0929 19:55:25.068157 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac525162_2a7f_42e6_a5b5_ec34984237b3.slice/crio-8c00530ed4cb226284dfe410a37e05776b0385c6e67053fb6bbea23fd921e28e WatchSource:0}: Error finding container 8c00530ed4cb226284dfe410a37e05776b0385c6e67053fb6bbea23fd921e28e: Status 404 returned error can't find the container with id 8c00530ed4cb226284dfe410a37e05776b0385c6e67053fb6bbea23fd921e28e
Sep 29 19:55:25 crc kubenswrapper[4780]: I0929 19:55:25.174599 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerStarted","Data":"8c00530ed4cb226284dfe410a37e05776b0385c6e67053fb6bbea23fd921e28e"}
Sep 29 19:55:26 crc kubenswrapper[4780]: I0929 19:55:26.186241 4780 generic.go:334] "Generic (PLEG): container finished" podID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerID="b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e" exitCode=0
Sep 29 19:55:26 crc kubenswrapper[4780]: I0929 19:55:26.186311 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerDied","Data":"b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e"}
Sep 29 19:55:27 crc kubenswrapper[4780]: I0929 19:55:27.194805 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerStarted","Data":"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f"}
Sep 29 19:55:27 crc kubenswrapper[4780]: I0929 19:55:27.753872 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb"
Sep 29 19:55:27 crc kubenswrapper[4780]: E0929 19:55:27.754503 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 19:55:28 crc kubenswrapper[4780]: I0929 19:55:28.207994 4780 generic.go:334] "Generic (PLEG): container finished" podID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerID="c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f" exitCode=0
Sep 29 19:55:28 crc kubenswrapper[4780]: I0929 19:55:28.208981 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerDied","Data":"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f"}
Sep 29 19:55:29 crc kubenswrapper[4780]: I0929 19:55:29.225203 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z"
event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerStarted","Data":"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2"} Sep 29 19:55:29 crc kubenswrapper[4780]: I0929 19:55:29.262244 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dpx9z" podStartSLOduration=2.8351060710000002 podStartE2EDuration="5.262219161s" podCreationTimestamp="2025-09-29 19:55:24 +0000 UTC" firstStartedPulling="2025-09-29 19:55:26.189369063 +0000 UTC m=+4326.137667147" lastFinishedPulling="2025-09-29 19:55:28.616482163 +0000 UTC m=+4328.564780237" observedRunningTime="2025-09-29 19:55:29.252430093 +0000 UTC m=+4329.200728207" watchObservedRunningTime="2025-09-29 19:55:29.262219161 +0000 UTC m=+4329.210517235" Sep 29 19:55:34 crc kubenswrapper[4780]: I0929 19:55:34.805682 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:34 crc kubenswrapper[4780]: I0929 19:55:34.806165 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:34 crc kubenswrapper[4780]: I0929 19:55:34.874716 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:35 crc kubenswrapper[4780]: I0929 19:55:35.355446 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.448547 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"] Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.448927 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dpx9z" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="registry-server" containerID="cri-o://aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2" gracePeriod=2 Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.753567 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:55:38 crc kubenswrapper[4780]: E0929 19:55:38.754514 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.958445 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.976591 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content\") pod \"ac525162-2a7f-42e6-a5b5-ec34984237b3\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.976776 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l675\" (UniqueName: \"kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675\") pod \"ac525162-2a7f-42e6-a5b5-ec34984237b3\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.976799 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities\") pod \"ac525162-2a7f-42e6-a5b5-ec34984237b3\" (UID: \"ac525162-2a7f-42e6-a5b5-ec34984237b3\") " Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.977771 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities" (OuterVolumeSpecName: "utilities") pod "ac525162-2a7f-42e6-a5b5-ec34984237b3" (UID: "ac525162-2a7f-42e6-a5b5-ec34984237b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.987869 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675" (OuterVolumeSpecName: "kube-api-access-4l675") pod "ac525162-2a7f-42e6-a5b5-ec34984237b3" (UID: "ac525162-2a7f-42e6-a5b5-ec34984237b3"). InnerVolumeSpecName "kube-api-access-4l675". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 19:55:38 crc kubenswrapper[4780]: I0929 19:55:38.994619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac525162-2a7f-42e6-a5b5-ec34984237b3" (UID: "ac525162-2a7f-42e6-a5b5-ec34984237b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.079238 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l675\" (UniqueName: \"kubernetes.io/projected/ac525162-2a7f-42e6-a5b5-ec34984237b3-kube-api-access-4l675\") on node \"crc\" DevicePath \"\"" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.079293 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.079311 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac525162-2a7f-42e6-a5b5-ec34984237b3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.320374 4780 generic.go:334] "Generic (PLEG): container finished" podID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerID="aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2" exitCode=0 Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.320488 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerDied","Data":"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2"} Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.320462 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpx9z" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.320562 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpx9z" event={"ID":"ac525162-2a7f-42e6-a5b5-ec34984237b3","Type":"ContainerDied","Data":"8c00530ed4cb226284dfe410a37e05776b0385c6e67053fb6bbea23fd921e28e"} Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.320605 4780 scope.go:117] "RemoveContainer" containerID="aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.346000 4780 scope.go:117] "RemoveContainer" containerID="c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.377346 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"] Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.383941 4780 scope.go:117] "RemoveContainer" containerID="b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.386406 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpx9z"] Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.415406 4780 scope.go:117] "RemoveContainer" containerID="aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2" Sep 29 19:55:39 crc kubenswrapper[4780]: E0929 19:55:39.416428 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2\": container with ID starting with aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2 not found: ID does not exist" containerID="aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.416505 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2"} err="failed to get container status \"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2\": rpc error: code = NotFound desc = could not find container \"aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2\": container with ID starting with aebad07f7fa254d03f307f7f411a0dfd349b6ca5a6ac26f88bb7a82384631dc2 not found: ID does not exist" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.416554 4780 scope.go:117] "RemoveContainer" containerID="c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f" Sep 29 19:55:39 crc kubenswrapper[4780]: E0929 19:55:39.417025 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f\": container with ID starting with c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f not found: ID does not exist" containerID="c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.417098 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f"} err="failed to get container status \"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f\": rpc error: code = NotFound desc = could not find container \"c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f\": container with ID starting with c6bf8654c8e042207f8617ade32a1b6e9f29d4558297f4175b5d59cf87bbe22f not found: ID does not exist" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.417136 4780 scope.go:117] "RemoveContainer" containerID="b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e" Sep 29 19:55:39 crc kubenswrapper[4780]: E0929 19:55:39.417469 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e\": container with ID starting with b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e not found: ID does not exist" containerID="b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e" Sep 29 19:55:39 crc kubenswrapper[4780]: I0929 19:55:39.417517 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e"} err="failed to get container status \"b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e\": rpc error: code = NotFound desc = could not find container \"b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e\": container with ID starting with b2e1614ddb0eed8f619757054ec451cd83390f2800b7ed45d9c92684034a4d6e not found: ID does not exist" Sep 29 19:55:40 crc kubenswrapper[4780]: I0929 19:55:40.767108 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" path="/var/lib/kubelet/pods/ac525162-2a7f-42e6-a5b5-ec34984237b3/volumes" Sep 29 19:55:52 crc kubenswrapper[4780]: I0929 19:55:52.760957 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:55:52 crc kubenswrapper[4780]: E0929 19:55:52.761960 4780 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:56:05 crc kubenswrapper[4780]: I0929 19:56:05.753842 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:56:05 crc kubenswrapper[4780]: E0929 19:56:05.755192 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:56:18 crc kubenswrapper[4780]: I0929 19:56:18.753248 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:56:18 crc kubenswrapper[4780]: E0929 19:56:18.753995 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:56:31 crc kubenswrapper[4780]: I0929 19:56:31.753955 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:56:31 crc kubenswrapper[4780]: E0929 19:56:31.755265 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:56:46 crc kubenswrapper[4780]: I0929 19:56:46.753867 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:56:46 crc kubenswrapper[4780]: E0929 19:56:46.754882 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:56:59 crc kubenswrapper[4780]: I0929 19:56:59.753409 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:56:59 crc kubenswrapper[4780]: E0929 19:56:59.754323 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:57:12 crc kubenswrapper[4780]: I0929 19:57:12.753432 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:57:12 crc kubenswrapper[4780]: E0929 19:57:12.754366 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:57:25 crc kubenswrapper[4780]: I0929 19:57:25.753013 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:57:25 crc kubenswrapper[4780]: E0929 19:57:25.754019 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:57:37 crc kubenswrapper[4780]: I0929 19:57:37.753768 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:57:37 crc kubenswrapper[4780]: E0929 19:57:37.754662 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:57:50 crc kubenswrapper[4780]: I0929 19:57:50.763333 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:57:50 crc kubenswrapper[4780]: E0929 19:57:50.765014 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:58:05 crc kubenswrapper[4780]: I0929 19:58:05.754437 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:58:05 crc kubenswrapper[4780]: E0929 19:58:05.755660 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" 
podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:58:19 crc kubenswrapper[4780]: I0929 19:58:19.753314 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:58:19 crc kubenswrapper[4780]: E0929 19:58:19.754446 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:58:33 crc kubenswrapper[4780]: I0929 19:58:33.753909 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:58:33 crc kubenswrapper[4780]: E0929 19:58:33.755124 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:58:48 crc kubenswrapper[4780]: I0929 19:58:48.753217 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:58:48 crc kubenswrapper[4780]: E0929 19:58:48.754183 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:59:01 crc kubenswrapper[4780]: I0929 19:59:01.753671 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:59:01 crc kubenswrapper[4780]: E0929 19:59:01.754914 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:59:12 crc kubenswrapper[4780]: I0929 19:59:12.754025 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:59:12 crc kubenswrapper[4780]: E0929 19:59:12.755335 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:59:23 crc kubenswrapper[4780]: I0929 19:59:23.753247 4780 scope.go:117] "RemoveContainer" 
containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:59:23 crc kubenswrapper[4780]: E0929 19:59:23.754294 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:59:35 crc kubenswrapper[4780]: I0929 19:59:35.753642 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:59:35 crc kubenswrapper[4780]: E0929 19:59:35.754961 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 19:59:50 crc kubenswrapper[4780]: I0929 19:59:50.758178 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 19:59:50 crc kubenswrapper[4780]: E0929 19:59:50.758729 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.160073 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps"] Sep 29 20:00:00 crc kubenswrapper[4780]: E0929 20:00:00.188856 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="extract-utilities" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.188916 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="extract-utilities" Sep 29 20:00:00 crc kubenswrapper[4780]: E0929 20:00:00.188967 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="registry-server" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.188982 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="registry-server" Sep 29 20:00:00 crc kubenswrapper[4780]: E0929 20:00:00.189025 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="extract-content" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.189038 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="extract-content" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.189689 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac525162-2a7f-42e6-a5b5-ec34984237b3" containerName="registry-server" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 
20:00:00.190863 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps"] Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.191026 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.193254 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.193344 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.365414 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.365624 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpw7s\" (UniqueName: \"kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.366159 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.468234 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.468333 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.468403 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpw7s\" (UniqueName: \"kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.470784 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.482885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.505965 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpw7s\" (UniqueName: \"kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s\") pod \"collect-profiles-29319600-fgnps\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.522849 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:00 crc kubenswrapper[4780]: I0929 20:00:00.808192 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps"] Sep 29 20:00:01 crc kubenswrapper[4780]: I0929 20:00:01.753676 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 20:00:01 crc kubenswrapper[4780]: E0929 20:00:01.754642 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:00:01 crc kubenswrapper[4780]: I0929 20:00:01.820973 4780 generic.go:334] "Generic (PLEG): container finished" podID="d66e2f63-1585-47a8-85f9-05a7173c9e01" containerID="7b7399bf46013bd7afadfbc980eea598f26a034a8c6efc45ddb2e6129eff5966" exitCode=0 Sep 29 20:00:01 crc kubenswrapper[4780]: I0929 20:00:01.821028 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" event={"ID":"d66e2f63-1585-47a8-85f9-05a7173c9e01","Type":"ContainerDied","Data":"7b7399bf46013bd7afadfbc980eea598f26a034a8c6efc45ddb2e6129eff5966"} Sep 29 20:00:01 crc kubenswrapper[4780]: I0929 20:00:01.821101 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" event={"ID":"d66e2f63-1585-47a8-85f9-05a7173c9e01","Type":"ContainerStarted","Data":"35c8be6f32ddb2c5577d63b1c4943620eaa58104d08a257e51699ff65f239592"} Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.165728 4780 util.go:48] "No ready sandbox for pod can be found. 
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.165728 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps"
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.312458 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume\") pod \"d66e2f63-1585-47a8-85f9-05a7173c9e01\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") "
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.312584 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume\") pod \"d66e2f63-1585-47a8-85f9-05a7173c9e01\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") "
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.312706 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpw7s\" (UniqueName: \"kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s\") pod \"d66e2f63-1585-47a8-85f9-05a7173c9e01\" (UID: \"d66e2f63-1585-47a8-85f9-05a7173c9e01\") "
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.317196 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume" (OuterVolumeSpecName: "config-volume") pod "d66e2f63-1585-47a8-85f9-05a7173c9e01" (UID: "d66e2f63-1585-47a8-85f9-05a7173c9e01"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.321664 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d66e2f63-1585-47a8-85f9-05a7173c9e01" (UID: "d66e2f63-1585-47a8-85f9-05a7173c9e01"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.323710 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s" (OuterVolumeSpecName: "kube-api-access-qpw7s") pod "d66e2f63-1585-47a8-85f9-05a7173c9e01" (UID: "d66e2f63-1585-47a8-85f9-05a7173c9e01"). InnerVolumeSpecName "kube-api-access-qpw7s".
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.415272 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpw7s\" (UniqueName: \"kubernetes.io/projected/d66e2f63-1585-47a8-85f9-05a7173c9e01-kube-api-access-qpw7s\") on node \"crc\" DevicePath \"\"" Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.415319 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d66e2f63-1585-47a8-85f9-05a7173c9e01-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.415341 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d66e2f63-1585-47a8-85f9-05a7173c9e01-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.851401 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" event={"ID":"d66e2f63-1585-47a8-85f9-05a7173c9e01","Type":"ContainerDied","Data":"35c8be6f32ddb2c5577d63b1c4943620eaa58104d08a257e51699ff65f239592"} Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.851460 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35c8be6f32ddb2c5577d63b1c4943620eaa58104d08a257e51699ff65f239592" Sep 29 20:00:03 crc kubenswrapper[4780]: I0929 20:00:03.851506 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319600-fgnps" Sep 29 20:00:04 crc kubenswrapper[4780]: I0929 20:00:04.272574 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"] Sep 29 20:00:04 crc kubenswrapper[4780]: I0929 20:00:04.281380 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319555-9tlfq"] Sep 29 20:00:04 crc kubenswrapper[4780]: I0929 20:00:04.766466 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eac5c32-b786-4c2a-a97a-ba4400d3a3b0" path="/var/lib/kubelet/pods/4eac5c32-b786-4c2a-a97a-ba4400d3a3b0/volumes" Sep 29 20:00:13 crc kubenswrapper[4780]: I0929 20:00:13.754022 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 20:00:14 crc kubenswrapper[4780]: I0929 20:00:14.951534 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239"} Sep 29 20:00:27 crc kubenswrapper[4780]: I0929 20:00:27.065102 4780 scope.go:117] "RemoveContainer" containerID="0f2249c0677d88700c503b59bc6a9fddd94a5e159fe8eb6138551bbc907b4659" Sep 29 20:02:33 crc kubenswrapper[4780]: I0929 20:02:33.224100 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:02:33 crc kubenswrapper[4780]: I0929 20:02:33.224746 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" 
podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:03:03 crc kubenswrapper[4780]: I0929 20:03:03.223803 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:03:03 crc kubenswrapper[4780]: I0929 20:03:03.224635 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.292743 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jxt77"] Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.300839 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jxt77"] Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.422934 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lp28f"] Sep 29 20:03:16 crc kubenswrapper[4780]: E0929 20:03:16.423446 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e2f63-1585-47a8-85f9-05a7173c9e01" containerName="collect-profiles" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.423476 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e2f63-1585-47a8-85f9-05a7173c9e01" containerName="collect-profiles" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.423777 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66e2f63-1585-47a8-85f9-05a7173c9e01" containerName="collect-profiles" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.424574 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.427603 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.427853 4780 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76nlq" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.428275 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.429918 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.434623 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lp28f"] Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.488568 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.488711 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.488883 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmww\" (UniqueName: \"kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.590877 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.591295 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmww\" (UniqueName: \"kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.591431 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.591802 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " 
pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.592840 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.628595 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmww\" (UniqueName: \"kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww\") pod \"crc-storage-crc-lp28f\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.758344 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:16 crc kubenswrapper[4780]: I0929 20:03:16.767777 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5c19d4-0266-4678-91ba-a446f91369b1" path="/var/lib/kubelet/pods/ed5c19d4-0266-4678-91ba-a446f91369b1/volumes" Sep 29 20:03:17 crc kubenswrapper[4780]: I0929 20:03:17.102292 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lp28f"] Sep 29 20:03:17 crc kubenswrapper[4780]: I0929 20:03:17.120877 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 20:03:17 crc kubenswrapper[4780]: I0929 20:03:17.736458 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lp28f" event={"ID":"47af1407-6bc3-412b-b441-2e0c473eb9a9","Type":"ContainerStarted","Data":"431818c4e888f341cc8b9250509752d19c5919c55d8e54fce4191a48a36eb1b2"} Sep 29 20:03:18 crc kubenswrapper[4780]: I0929 20:03:18.752100 4780 generic.go:334] "Generic (PLEG): container finished" podID="47af1407-6bc3-412b-b441-2e0c473eb9a9" containerID="20e896b97f0a24815659c6a08319ddca461008a924b321237ffb98cf131324cc" exitCode=0 Sep 29 20:03:18 crc kubenswrapper[4780]: I0929 20:03:18.772548 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lp28f" event={"ID":"47af1407-6bc3-412b-b441-2e0c473eb9a9","Type":"ContainerDied","Data":"20e896b97f0a24815659c6a08319ddca461008a924b321237ffb98cf131324cc"} Sep 29 20:03:20 crc kubenswrapper[4780]: I0929 20:03:20.804006 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lp28f" event={"ID":"47af1407-6bc3-412b-b441-2e0c473eb9a9","Type":"ContainerDied","Data":"431818c4e888f341cc8b9250509752d19c5919c55d8e54fce4191a48a36eb1b2"} Sep 29 20:03:20 crc kubenswrapper[4780]: I0929 20:03:20.804449 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431818c4e888f341cc8b9250509752d19c5919c55d8e54fce4191a48a36eb1b2" Sep 29 20:03:20 crc kubenswrapper[4780]: I0929 20:03:20.959768 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.062152 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage\") pod \"47af1407-6bc3-412b-b441-2e0c473eb9a9\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.062264 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt\") pod \"47af1407-6bc3-412b-b441-2e0c473eb9a9\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.062296 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmww\" (UniqueName: \"kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww\") pod \"47af1407-6bc3-412b-b441-2e0c473eb9a9\" (UID: \"47af1407-6bc3-412b-b441-2e0c473eb9a9\") " Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.062402 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "47af1407-6bc3-412b-b441-2e0c473eb9a9" (UID: "47af1407-6bc3-412b-b441-2e0c473eb9a9"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.062993 4780 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/47af1407-6bc3-412b-b441-2e0c473eb9a9-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.127500 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "47af1407-6bc3-412b-b441-2e0c473eb9a9" (UID: "47af1407-6bc3-412b-b441-2e0c473eb9a9"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.129739 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww" (OuterVolumeSpecName: "kube-api-access-4hmww") pod "47af1407-6bc3-412b-b441-2e0c473eb9a9" (UID: "47af1407-6bc3-412b-b441-2e0c473eb9a9"). InnerVolumeSpecName "kube-api-access-4hmww". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.164852 4780 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/47af1407-6bc3-412b-b441-2e0c473eb9a9-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.164904 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmww\" (UniqueName: \"kubernetes.io/projected/47af1407-6bc3-412b-b441-2e0c473eb9a9-kube-api-access-4hmww\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:21 crc kubenswrapper[4780]: I0929 20:03:21.812437 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lp28f" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.067156 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lp28f"] Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.071726 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lp28f"] Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.203157 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-r9lnv"] Sep 29 20:03:23 crc kubenswrapper[4780]: E0929 20:03:23.203572 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47af1407-6bc3-412b-b441-2e0c473eb9a9" containerName="storage" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.203600 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="47af1407-6bc3-412b-b441-2e0c473eb9a9" containerName="storage" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.203883 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="47af1407-6bc3-412b-b441-2e0c473eb9a9" containerName="storage" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.204628 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.208253 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.208498 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.209703 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.210657 4780 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-76nlq" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.215369 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-r9lnv"] Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.298927 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfz8n\" (UniqueName: \"kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.299394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.299486 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.401479 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfz8n\" (UniqueName: 
\"kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.401594 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.401684 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.402496 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.402965 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.431721 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfz8n\" (UniqueName: \"kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n\") pod \"crc-storage-crc-r9lnv\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.529561 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:23 crc kubenswrapper[4780]: I0929 20:03:23.823939 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-r9lnv"] Sep 29 20:03:24 crc kubenswrapper[4780]: I0929 20:03:24.773545 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47af1407-6bc3-412b-b441-2e0c473eb9a9" path="/var/lib/kubelet/pods/47af1407-6bc3-412b-b441-2e0c473eb9a9/volumes" Sep 29 20:03:24 crc kubenswrapper[4780]: I0929 20:03:24.840716 4780 generic.go:334] "Generic (PLEG): container finished" podID="1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" containerID="8204c233ccbe8c2a46e6282daf2f12d70a2bcd5191925c61758dfd4d919e3335" exitCode=0 Sep 29 20:03:24 crc kubenswrapper[4780]: I0929 20:03:24.840797 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-r9lnv" event={"ID":"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b","Type":"ContainerDied","Data":"8204c233ccbe8c2a46e6282daf2f12d70a2bcd5191925c61758dfd4d919e3335"} Sep 29 20:03:24 crc kubenswrapper[4780]: I0929 20:03:24.840840 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-r9lnv" event={"ID":"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b","Type":"ContainerStarted","Data":"765457fb85ca760da1e52ed3a82a903f7c665b3b02861fb92a9db32cc0dba3f1"} Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.272968 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.450127 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt\") pod \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.450237 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfz8n\" (UniqueName: \"kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n\") pod \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.450303 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage\") pod \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\" (UID: \"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b\") " Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.450362 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" (UID: "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.450550 4780 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-node-mnt\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.459133 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n" (OuterVolumeSpecName: "kube-api-access-mfz8n") pod "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" (UID: "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b"). InnerVolumeSpecName "kube-api-access-mfz8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.483703 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" (UID: "1cdc2c27-4ab0-468e-a03f-d6a76dc3837b"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.552095 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfz8n\" (UniqueName: \"kubernetes.io/projected/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-kube-api-access-mfz8n\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.552394 4780 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1cdc2c27-4ab0-468e-a03f-d6a76dc3837b-crc-storage\") on node \"crc\" DevicePath \"\"" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.864447 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-r9lnv" event={"ID":"1cdc2c27-4ab0-468e-a03f-d6a76dc3837b","Type":"ContainerDied","Data":"765457fb85ca760da1e52ed3a82a903f7c665b3b02861fb92a9db32cc0dba3f1"} Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.864528 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765457fb85ca760da1e52ed3a82a903f7c665b3b02861fb92a9db32cc0dba3f1" Sep 29 20:03:26 crc kubenswrapper[4780]: I0929 20:03:26.864580 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-r9lnv" Sep 29 20:03:27 crc kubenswrapper[4780]: I0929 20:03:27.164093 4780 scope.go:117] "RemoveContainer" containerID="36fca5ee83fc7460f10b3faa2dcf636fccff1bc70b19ff11a2a1ce4bfb2e92b9" Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.223854 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.224603 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.224670 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.225567 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.225671 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239" gracePeriod=600 Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.931143 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239" exitCode=0 Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.931208 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239"} Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.931532 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"} Sep 29 20:03:33 crc kubenswrapper[4780]: I0929 20:03:33.931567 4780 scope.go:117] "RemoveContainer" containerID="c8fda5a0bd65a0feb0b3ba8ce22aed003cb7efb7bb7499136c9a50b1e2624ffb" Sep 29 20:04:09 crc kubenswrapper[4780]: I0929 20:04:09.895953 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:09 crc kubenswrapper[4780]: E0929 20:04:09.897249 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" containerName="storage" Sep 29 20:04:09 crc kubenswrapper[4780]: I0929 
20:04:09.897273 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" containerName="storage" Sep 29 20:04:09 crc kubenswrapper[4780]: I0929 20:04:09.897497 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cdc2c27-4ab0-468e-a03f-d6a76dc3837b" containerName="storage" Sep 29 20:04:09 crc kubenswrapper[4780]: I0929 20:04:09.899214 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:09 crc kubenswrapper[4780]: I0929 20:04:09.908512 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.011073 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqdsz\" (UniqueName: \"kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.011125 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.011174 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.112595 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.113090 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqdsz\" (UniqueName: \"kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.113303 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.113342 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.113670 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.134466 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqdsz\" (UniqueName: \"kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz\") pod \"redhat-operators-c4qlq\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.230459 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:10 crc kubenswrapper[4780]: I0929 20:04:10.707031 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:11 crc kubenswrapper[4780]: I0929 20:04:11.295186 4780 generic.go:334] "Generic (PLEG): container finished" podID="2446a015-6a88-4716-af08-2772eae79551" containerID="af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874" exitCode=0 Sep 29 20:04:11 crc kubenswrapper[4780]: I0929 20:04:11.295433 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerDied","Data":"af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874"} Sep 29 20:04:11 crc kubenswrapper[4780]: I0929 20:04:11.295458 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerStarted","Data":"57f63c53a65c579580d6223212d4b94c05e8086c64c0a63e6b9b9e8b2337d97c"} Sep 29 20:04:13 crc kubenswrapper[4780]: I0929 20:04:13.318578 4780 generic.go:334] "Generic (PLEG): container finished" podID="2446a015-6a88-4716-af08-2772eae79551" containerID="ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85" exitCode=0 Sep 29 20:04:13 crc kubenswrapper[4780]: I0929 20:04:13.318670 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerDied","Data":"ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85"} Sep 29 20:04:14 crc kubenswrapper[4780]: I0929 20:04:14.331155 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerStarted","Data":"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531"} Sep 29 20:04:14 crc kubenswrapper[4780]: I0929 20:04:14.368807 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4qlq" podStartSLOduration=2.7778703240000002 podStartE2EDuration="5.368782393s" podCreationTimestamp="2025-09-29 20:04:09 +0000 UTC" firstStartedPulling="2025-09-29 20:04:11.297460276 +0000 UTC m=+4851.245758330" lastFinishedPulling="2025-09-29 20:04:13.888372325 +0000 UTC m=+4853.836670399" observedRunningTime="2025-09-29 20:04:14.359368663 +0000 UTC m=+4854.307666737" watchObservedRunningTime="2025-09-29 20:04:14.368782393 +0000 UTC m=+4854.317080477" Sep 29 20:04:20 crc kubenswrapper[4780]: I0929 20:04:20.231889 4780 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:20 crc kubenswrapper[4780]: I0929 20:04:20.232636 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:20 crc kubenswrapper[4780]: I0929 20:04:20.297439 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:20 crc kubenswrapper[4780]: I0929 20:04:20.463999 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:20 crc kubenswrapper[4780]: I0929 20:04:20.551706 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:22 crc kubenswrapper[4780]: I0929 20:04:22.412201 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4qlq" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="registry-server" containerID="cri-o://a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531" gracePeriod=2 Sep 29 20:04:22 crc kubenswrapper[4780]: I0929 20:04:22.915903 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.019031 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqdsz\" (UniqueName: \"kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz\") pod \"2446a015-6a88-4716-af08-2772eae79551\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.019119 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities\") pod \"2446a015-6a88-4716-af08-2772eae79551\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.019201 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content\") pod \"2446a015-6a88-4716-af08-2772eae79551\" (UID: \"2446a015-6a88-4716-af08-2772eae79551\") " Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.020170 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities" (OuterVolumeSpecName: "utilities") pod "2446a015-6a88-4716-af08-2772eae79551" (UID: "2446a015-6a88-4716-af08-2772eae79551"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.030770 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz" (OuterVolumeSpecName: "kube-api-access-tqdsz") pod "2446a015-6a88-4716-af08-2772eae79551" (UID: "2446a015-6a88-4716-af08-2772eae79551"). InnerVolumeSpecName "kube-api-access-tqdsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.120727 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqdsz\" (UniqueName: \"kubernetes.io/projected/2446a015-6a88-4716-af08-2772eae79551-kube-api-access-tqdsz\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.120759 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.425900 4780 generic.go:334] "Generic (PLEG): container finished" podID="2446a015-6a88-4716-af08-2772eae79551" containerID="a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531" exitCode=0 Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.425964 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerDied","Data":"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531"} Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.426007 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4qlq" event={"ID":"2446a015-6a88-4716-af08-2772eae79551","Type":"ContainerDied","Data":"57f63c53a65c579580d6223212d4b94c05e8086c64c0a63e6b9b9e8b2337d97c"} Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.426035 4780 scope.go:117] "RemoveContainer" containerID="a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.426084 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4qlq" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.457588 4780 scope.go:117] "RemoveContainer" containerID="ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.487471 4780 scope.go:117] "RemoveContainer" containerID="af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.535503 4780 scope.go:117] "RemoveContainer" containerID="a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531" Sep 29 20:04:23 crc kubenswrapper[4780]: E0929 20:04:23.536318 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531\": container with ID starting with a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531 not found: ID does not exist" containerID="a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.536402 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531"} err="failed to get container status \"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531\": rpc error: code = NotFound desc = could not find container \"a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531\": container with ID starting with a6de04b9aef32b5707a3574d1bc477f05f1965ad0e945099640b1d861051e531 not found: ID does not exist" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.536435 4780 scope.go:117] "RemoveContainer" containerID="ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85" Sep 29 20:04:23 crc kubenswrapper[4780]: E0929 20:04:23.537306 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85\": container with ID starting with ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85 not found: ID does not exist" containerID="ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.537339 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85"} err="failed to get container status \"ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85\": rpc error: code = NotFound desc = could not find container \"ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85\": container with ID starting with ceba7a3bdbc064fed31d97c6e6c74cb4c3971bd368be737d5bdfb8a483fcde85 not found: ID does not exist" Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.537365 4780 scope.go:117] "RemoveContainer" containerID="af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874" Sep 29 20:04:23 crc kubenswrapper[4780]: E0929 20:04:23.537835 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874\": container with ID starting with af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874 not found: ID does not exist" containerID="af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874" 
Sep 29 20:04:23 crc kubenswrapper[4780]: I0929 20:04:23.537859 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874"} err="failed to get container status \"af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874\": rpc error: code = NotFound desc = could not find container \"af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874\": container with ID starting with af933b85c31fd21760bacd3033449c575f2f9339c0982edc6f4f2ccf3e30a874 not found: ID does not exist" Sep 29 20:04:24 crc kubenswrapper[4780]: I0929 20:04:24.038405 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2446a015-6a88-4716-af08-2772eae79551" (UID: "2446a015-6a88-4716-af08-2772eae79551"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:24 crc kubenswrapper[4780]: I0929 20:04:24.137148 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2446a015-6a88-4716-af08-2772eae79551-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:24 crc kubenswrapper[4780]: I0929 20:04:24.383133 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:24 crc kubenswrapper[4780]: I0929 20:04:24.393245 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4qlq"] Sep 29 20:04:24 crc kubenswrapper[4780]: I0929 20:04:24.764511 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2446a015-6a88-4716-af08-2772eae79551" path="/var/lib/kubelet/pods/2446a015-6a88-4716-af08-2772eae79551/volumes" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.891420 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"] Sep 29 20:04:41 crc kubenswrapper[4780]: E0929 20:04:41.892496 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="registry-server" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.892519 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="registry-server" Sep 29 20:04:41 crc kubenswrapper[4780]: E0929 20:04:41.892551 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="extract-content" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.892565 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="extract-content" Sep 29 20:04:41 crc kubenswrapper[4780]: E0929 20:04:41.892593 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="extract-utilities" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.892606 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="extract-utilities" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.892904 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2446a015-6a88-4716-af08-2772eae79551" containerName="registry-server" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.894797 4780 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:41 crc kubenswrapper[4780]: I0929 20:04:41.909394 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"] Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.069394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwm46\" (UniqueName: \"kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.069863 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.069964 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.172372 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwm46\" (UniqueName: \"kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.173569 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.173701 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.174381 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.174665 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.205149 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwm46\" (UniqueName: \"kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46\") pod \"certified-operators-f6g7w\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.238118 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.301819 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.305603 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.313268 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.383513 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rh6s\" (UniqueName: \"kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.383577 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.383603 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.485275 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rh6s\" (UniqueName: \"kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.485631 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.485660 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 
20:04:42.486184 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.487334 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.526130 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rh6s\" (UniqueName: \"kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s\") pod \"community-operators-q8qnz\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.661630 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.785898 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"] Sep 29 20:04:42 crc kubenswrapper[4780]: I0929 20:04:42.891684 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.613835 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerID="07bd4a17c506d96216689ec18fbac392b4dc16313cf6bc2bc2c90f35e8bcd4b8" exitCode=0 Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.613961 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerDied","Data":"07bd4a17c506d96216689ec18fbac392b4dc16313cf6bc2bc2c90f35e8bcd4b8"} Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.614578 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerStarted","Data":"193dd7dd12c60ed0c0f3657e9ecf5810c23b6d90aaa49c407996fd33e73a6238"} Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.620607 4780 generic.go:334] "Generic (PLEG): container finished" podID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerID="1ca5d8d0a79e31534988518dff6b1a7bb7a85682854ab04e52f01e745093a124" exitCode=0 Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.620677 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerDied","Data":"1ca5d8d0a79e31534988518dff6b1a7bb7a85682854ab04e52f01e745093a124"} Sep 29 20:04:43 crc kubenswrapper[4780]: I0929 20:04:43.620727 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerStarted","Data":"4be4a21d5262cee9ee54f4b949109644faa1f0fbf5da775c98f46a919941d194"} Sep 29 20:04:45 crc kubenswrapper[4780]: I0929 20:04:45.654364 4780 generic.go:334] "Generic 
(PLEG): container finished" podID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerID="1c7d65b340b8086a79bb8491ec42d7497a6d3c38f84487c133741a5e2f5a0e09" exitCode=0 Sep 29 20:04:45 crc kubenswrapper[4780]: I0929 20:04:45.654452 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerDied","Data":"1c7d65b340b8086a79bb8491ec42d7497a6d3c38f84487c133741a5e2f5a0e09"} Sep 29 20:04:45 crc kubenswrapper[4780]: I0929 20:04:45.661311 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerID="6e4f021a7fee56bfde295558947bd85d4626d9f230597405dd30532ee9de65d2" exitCode=0 Sep 29 20:04:45 crc kubenswrapper[4780]: I0929 20:04:45.661397 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerDied","Data":"6e4f021a7fee56bfde295558947bd85d4626d9f230597405dd30532ee9de65d2"} Sep 29 20:04:46 crc kubenswrapper[4780]: I0929 20:04:46.671301 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerStarted","Data":"4e50b7401daa16f55ef97673006aee29de629cb9dbe65182b53ad42e1018009d"} Sep 29 20:04:46 crc kubenswrapper[4780]: I0929 20:04:46.676295 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerStarted","Data":"3443bde10ae12813ec2481e86582bc5505a6ba550dbf2ccb9e0ca87a0833d0c7"} Sep 29 20:04:46 crc kubenswrapper[4780]: I0929 20:04:46.707036 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q8qnz" podStartSLOduration=2.260343628 podStartE2EDuration="4.707016477s" podCreationTimestamp="2025-09-29 20:04:42 +0000 UTC" firstStartedPulling="2025-09-29 20:04:43.6242185 +0000 UTC m=+4883.572516584" lastFinishedPulling="2025-09-29 20:04:46.070891359 +0000 UTC m=+4886.019189433" observedRunningTime="2025-09-29 20:04:46.69699108 +0000 UTC m=+4886.645289154" watchObservedRunningTime="2025-09-29 20:04:46.707016477 +0000 UTC m=+4886.655314531" Sep 29 20:04:46 crc kubenswrapper[4780]: I0929 20:04:46.729596 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f6g7w" podStartSLOduration=3.178903285 podStartE2EDuration="5.729567443s" podCreationTimestamp="2025-09-29 20:04:41 +0000 UTC" firstStartedPulling="2025-09-29 20:04:43.616337234 +0000 UTC m=+4883.564635278" lastFinishedPulling="2025-09-29 20:04:46.167001362 +0000 UTC m=+4886.115299436" observedRunningTime="2025-09-29 20:04:46.720585395 +0000 UTC m=+4886.668883479" watchObservedRunningTime="2025-09-29 20:04:46.729567443 +0000 UTC m=+4886.677865527" Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.238992 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.239870 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.314371 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 
Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.662429 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q8qnz"
Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.737952 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q8qnz"
Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.818450 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f6g7w"
Sep 29 20:04:52 crc kubenswrapper[4780]: I0929 20:04:52.830539 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q8qnz"
Sep 29 20:04:55 crc kubenswrapper[4780]: I0929 20:04:55.272827 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"]
Sep 29 20:04:55 crc kubenswrapper[4780]: I0929 20:04:55.273479 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f6g7w" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="registry-server" containerID="cri-o://3443bde10ae12813ec2481e86582bc5505a6ba550dbf2ccb9e0ca87a0833d0c7" gracePeriod=2
Sep 29 20:04:55 crc kubenswrapper[4780]: I0929 20:04:55.771973 4780 generic.go:334] "Generic (PLEG): container finished" podID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerID="3443bde10ae12813ec2481e86582bc5505a6ba550dbf2ccb9e0ca87a0833d0c7" exitCode=0
Sep 29 20:04:55 crc kubenswrapper[4780]: I0929 20:04:55.772058 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerDied","Data":"3443bde10ae12813ec2481e86582bc5505a6ba550dbf2ccb9e0ca87a0833d0c7"}
Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.266198 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6g7w"
Need to start a new one" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.428198 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwm46\" (UniqueName: \"kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46\") pod \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.429086 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content\") pod \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.429198 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities\") pod \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\" (UID: \"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16\") " Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.430903 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities" (OuterVolumeSpecName: "utilities") pod "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" (UID: "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.433530 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46" (OuterVolumeSpecName: "kube-api-access-jwm46") pod "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" (UID: "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16"). InnerVolumeSpecName "kube-api-access-jwm46". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.485831 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" (UID: "3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.531173 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwm46\" (UniqueName: \"kubernetes.io/projected/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-kube-api-access-jwm46\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.531228 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.531250 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.785381 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6g7w" event={"ID":"3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16","Type":"ContainerDied","Data":"193dd7dd12c60ed0c0f3657e9ecf5810c23b6d90aaa49c407996fd33e73a6238"} Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.785466 4780 scope.go:117] "RemoveContainer" containerID="3443bde10ae12813ec2481e86582bc5505a6ba550dbf2ccb9e0ca87a0833d0c7" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.785464 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6g7w" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.814412 4780 scope.go:117] "RemoveContainer" containerID="6e4f021a7fee56bfde295558947bd85d4626d9f230597405dd30532ee9de65d2" Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.815385 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"] Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.829317 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f6g7w"] Sep 29 20:04:56 crc kubenswrapper[4780]: I0929 20:04:56.837798 4780 scope.go:117] "RemoveContainer" containerID="07bd4a17c506d96216689ec18fbac392b4dc16313cf6bc2bc2c90f35e8bcd4b8" Sep 29 20:04:57 crc kubenswrapper[4780]: I0929 20:04:57.285170 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:57 crc kubenswrapper[4780]: I0929 20:04:57.285774 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q8qnz" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="registry-server" containerID="cri-o://4e50b7401daa16f55ef97673006aee29de629cb9dbe65182b53ad42e1018009d" gracePeriod=2 Sep 29 20:04:57 crc kubenswrapper[4780]: I0929 20:04:57.802005 4780 generic.go:334] "Generic (PLEG): container finished" podID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerID="4e50b7401daa16f55ef97673006aee29de629cb9dbe65182b53ad42e1018009d" exitCode=0 Sep 29 20:04:57 crc kubenswrapper[4780]: I0929 20:04:57.802104 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerDied","Data":"4e50b7401daa16f55ef97673006aee29de629cb9dbe65182b53ad42e1018009d"} Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.005928 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.157506 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content\") pod \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.157579 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities\") pod \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.157802 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rh6s\" (UniqueName: \"kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s\") pod \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\" (UID: \"f36ab2ed-dc78-4b9f-907e-f02a63d69fea\") " Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.160126 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities" (OuterVolumeSpecName: "utilities") pod "f36ab2ed-dc78-4b9f-907e-f02a63d69fea" (UID: "f36ab2ed-dc78-4b9f-907e-f02a63d69fea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.167036 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s" (OuterVolumeSpecName: "kube-api-access-5rh6s") pod "f36ab2ed-dc78-4b9f-907e-f02a63d69fea" (UID: "f36ab2ed-dc78-4b9f-907e-f02a63d69fea"). InnerVolumeSpecName "kube-api-access-5rh6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.236350 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f36ab2ed-dc78-4b9f-907e-f02a63d69fea" (UID: "f36ab2ed-dc78-4b9f-907e-f02a63d69fea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.260899 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rh6s\" (UniqueName: \"kubernetes.io/projected/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-kube-api-access-5rh6s\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.261003 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.261030 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36ab2ed-dc78-4b9f-907e-f02a63d69fea-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.773481 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" path="/var/lib/kubelet/pods/3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16/volumes" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.827671 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8qnz" event={"ID":"f36ab2ed-dc78-4b9f-907e-f02a63d69fea","Type":"ContainerDied","Data":"4be4a21d5262cee9ee54f4b949109644faa1f0fbf5da775c98f46a919941d194"} Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.828171 4780 scope.go:117] "RemoveContainer" containerID="4e50b7401daa16f55ef97673006aee29de629cb9dbe65182b53ad42e1018009d" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.827751 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8qnz" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.866781 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.870951 4780 scope.go:117] "RemoveContainer" containerID="1c7d65b340b8086a79bb8491ec42d7497a6d3c38f84487c133741a5e2f5a0e09" Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.878102 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q8qnz"] Sep 29 20:04:58 crc kubenswrapper[4780]: I0929 20:04:58.905360 4780 scope.go:117] "RemoveContainer" containerID="1ca5d8d0a79e31534988518dff6b1a7bb7a85682854ab04e52f01e745093a124" Sep 29 20:05:00 crc kubenswrapper[4780]: I0929 20:05:00.771361 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" path="/var/lib/kubelet/pods/f36ab2ed-dc78-4b9f-907e-f02a63d69fea/volumes" Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.979893 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"] Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980796 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="registry-server" Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.980813 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="registry-server" Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980830 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="extract-utilities" Sep 29 20:05:21 crc 
Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980855 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="extract-utilities"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.980863 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="extract-utilities"
Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980874 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="extract-content"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.980881 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="extract-content"
Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980892 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="registry-server"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.980899 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="registry-server"
Sep 29 20:05:21 crc kubenswrapper[4780]: E0929 20:05:21.980911 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="extract-content"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.980919 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="extract-content"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.981153 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36ab2ed-dc78-4b9f-907e-f02a63d69fea" containerName="registry-server"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.981171 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee2dd91-12ee-42bc-8811-d0ca4d4c8d16" containerName="registry-server"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.982039 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.984386 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h7pt6"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.984432 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.984518 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.984566 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Sep 29 20:05:21 crc kubenswrapper[4780]: I0929 20:05:21.997442 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"]
Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.042372 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"]
Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.043666 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn"
Need to start a new one" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.046716 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.049220 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.060276 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5ts\" (UniqueName: \"kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.060604 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.060756 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsg6l\" (UniqueName: \"kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.060854 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.060960 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.162014 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw5ts\" (UniqueName: \"kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.162105 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.162141 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsg6l\" (UniqueName: \"kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " 
pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.162163 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.162182 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.163130 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.163145 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.163397 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.184021 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsg6l\" (UniqueName: \"kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l\") pod \"dnsmasq-dns-775cb64d69-kj6fn\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.184031 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5ts\" (UniqueName: \"kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts\") pod \"dnsmasq-dns-87cd8867c-gl8ld\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.297276 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.363789 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.448602 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.451576 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.452851 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.457430 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.573559 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.573604 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.573641 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xph\" (UniqueName: \"kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.678028 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.678378 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.678412 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xph\" (UniqueName: \"kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.678820 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.679205 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.701573 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.705918 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p8xph\" (UniqueName: \"kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph\") pod \"dnsmasq-dns-bc7bd85-rdx5g\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.799824 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.817402 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.833382 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.864873 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.867005 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.875915 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.880267 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxs5\" (UniqueName: \"kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.880325 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.880369 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.981718 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxs5\" (UniqueName: \"kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:22 crc kubenswrapper[4780]: I0929 20:05:22.981773 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:22.981816 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" 
(UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:22.982828 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:22.983147 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.007846 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxs5\" (UniqueName: \"kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5\") pod \"dnsmasq-dns-5f455d6d69-q98wd\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.077790 4780 generic.go:334] "Generic (PLEG): container finished" podID="2277e48b-d2cb-4512-8a19-dc04b00a919e" containerID="1ca060b75d50a27d2f24c9515077ebc457c0d93ced0c68854c0b5faa4f05555e" exitCode=0 Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.077892 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" event={"ID":"2277e48b-d2cb-4512-8a19-dc04b00a919e","Type":"ContainerDied","Data":"1ca060b75d50a27d2f24c9515077ebc457c0d93ced0c68854c0b5faa4f05555e"} Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.078225 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" event={"ID":"2277e48b-d2cb-4512-8a19-dc04b00a919e","Type":"ContainerStarted","Data":"cbb0f2db4b4e6e55def130db5e016e69bc9e5dbb8ab8ccdd8eabfc64b260ae91"} Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.079310 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" event={"ID":"64d5e94c-dbb8-4aa4-b416-8b84137155ff","Type":"ContainerStarted","Data":"45e84fe4113bb381b791b8ebef3684b4089b2ce0d94a316e66f724e0a601b23c"} Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.201469 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.315181 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:23 crc kubenswrapper[4780]: W0929 20:05:23.428603 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce7f542b_0e9c_4976_b537_20a916c01d27.slice/crio-2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b WatchSource:0}: Error finding container 2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b: Status 404 returned error can't find the container with id 2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.548263 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.581608 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:05:23 crc kubenswrapper[4780]: E0929 20:05:23.583024 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2277e48b-d2cb-4512-8a19-dc04b00a919e" containerName="init" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.583763 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2277e48b-d2cb-4512-8a19-dc04b00a919e" containerName="init" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.586206 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2277e48b-d2cb-4512-8a19-dc04b00a919e" containerName="init" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.590775 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.593630 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.595264 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.595901 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g5jlk" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.597646 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.597653 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.598532 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.610911 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.610960 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.651986 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.693584 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config\") pod \"2277e48b-d2cb-4512-8a19-dc04b00a919e\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.693638 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsg6l\" (UniqueName: \"kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l\") pod \"2277e48b-d2cb-4512-8a19-dc04b00a919e\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.693713 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc\") pod \"2277e48b-d2cb-4512-8a19-dc04b00a919e\" (UID: \"2277e48b-d2cb-4512-8a19-dc04b00a919e\") " Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694104 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694293 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694364 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4009b891-9e80-40fb-8205-eddb545b5424-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694391 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694523 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vw9\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694614 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694642 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694724 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694808 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.694841 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.708332 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l" (OuterVolumeSpecName: "kube-api-access-rsg6l") pod "2277e48b-d2cb-4512-8a19-dc04b00a919e" (UID: "2277e48b-d2cb-4512-8a19-dc04b00a919e"). InnerVolumeSpecName "kube-api-access-rsg6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:05:23 crc kubenswrapper[4780]: W0929 20:05:23.709337 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0037fee6_408c_46d7_a95d_1028921f3f0f.slice/crio-58868b19a17e7333f55d77f925738ce3023efa72d97ba39720085a6d6260ae85 WatchSource:0}: Error finding container 58868b19a17e7333f55d77f925738ce3023efa72d97ba39720085a6d6260ae85: Status 404 returned error can't find the container with id 58868b19a17e7333f55d77f925738ce3023efa72d97ba39720085a6d6260ae85
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.711929 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config" (OuterVolumeSpecName: "config") pod "2277e48b-d2cb-4512-8a19-dc04b00a919e" (UID: "2277e48b-d2cb-4512-8a19-dc04b00a919e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.711999 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2277e48b-d2cb-4512-8a19-dc04b00a919e" (UID: "2277e48b-d2cb-4512-8a19-dc04b00a919e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.795771 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796029 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796061 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796080 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4009b891-9e80-40fb-8205-eddb545b5424-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796097 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796161 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vw9\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796194 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796207 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796227 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796264 4780 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796283 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796320 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsg6l\" (UniqueName: \"kubernetes.io/projected/2277e48b-d2cb-4512-8a19-dc04b00a919e-kube-api-access-rsg6l\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796330 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796340 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2277e48b-d2cb-4512-8a19-dc04b00a919e-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.796724 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.797251 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.797766 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.798581 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801096 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801146 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0" Sep 29 
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801676 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801654 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801765 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.801797 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e285308d095dbc190ec4eda8d58fb5c556ca750eaac2fbd6c0192ddeec8db59/globalmount\"" pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.818469 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vw9\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.847195 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.925537 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 29 20:05:23 crc kubenswrapper[4780]: I0929 20:05:23.997613 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.000120 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.007032 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.007237 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.007562 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.007637 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tnlhl" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.007560 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.008292 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.008295 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.015263 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.098195 4780 generic.go:334] "Generic (PLEG): container finished" podID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerID="dc1247fb0bd8bdfa59323230295d5f5c7b1da19eceb7ea65df9c6ebf7f874939" exitCode=0 Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.098250 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" event={"ID":"0037fee6-408c-46d7-a95d-1028921f3f0f","Type":"ContainerDied","Data":"dc1247fb0bd8bdfa59323230295d5f5c7b1da19eceb7ea65df9c6ebf7f874939"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.098275 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" event={"ID":"0037fee6-408c-46d7-a95d-1028921f3f0f","Type":"ContainerStarted","Data":"58868b19a17e7333f55d77f925738ce3023efa72d97ba39720085a6d6260ae85"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101513 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49m4t\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101593 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101631 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 
20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101679 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101724 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101781 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101810 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101846 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101891 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101929 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.101955 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.111317 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.111304 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775cb64d69-kj6fn" event={"ID":"2277e48b-d2cb-4512-8a19-dc04b00a919e","Type":"ContainerDied","Data":"cbb0f2db4b4e6e55def130db5e016e69bc9e5dbb8ab8ccdd8eabfc64b260ae91"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.111699 4780 scope.go:117] "RemoveContainer" containerID="1ca060b75d50a27d2f24c9515077ebc457c0d93ced0c68854c0b5faa4f05555e" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.115377 4780 generic.go:334] "Generic (PLEG): container finished" podID="64d5e94c-dbb8-4aa4-b416-8b84137155ff" containerID="884127f6fa85ae1dc219f2b10f8b5a031bea11a815fc875ad9722f2943d66dff" exitCode=0 Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.115482 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" event={"ID":"64d5e94c-dbb8-4aa4-b416-8b84137155ff","Type":"ContainerDied","Data":"884127f6fa85ae1dc219f2b10f8b5a031bea11a815fc875ad9722f2943d66dff"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.126681 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerID="41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c" exitCode=0 Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.126755 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" event={"ID":"ce7f542b-0e9c-4976-b537-20a916c01d27","Type":"ContainerDied","Data":"41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.126795 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" event={"ID":"ce7f542b-0e9c-4976-b537-20a916c01d27","Type":"ContainerStarted","Data":"2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b"} Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203408 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203459 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203516 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203540 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc 
kubenswrapper[4780]: I0929 20:05:24.203567 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203608 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203630 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203647 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203827 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49m4t\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203866 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.203888 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.204456 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.205000 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.205289 4780 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.206480 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.209712 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.258905 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: E0929 20:05:24.268800 4780 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2277e48b_d2cb_4512_8a19_dc04b00a919e.slice/crio-cbb0f2db4b4e6e55def130db5e016e69bc9e5dbb8ab8ccdd8eabfc64b260ae91\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2277e48b_d2cb_4512_8a19_dc04b00a919e.slice\": RecentStats: unable to find data in memory cache]" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.281333 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.282222 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.282486 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
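
The csi_attacher.go:380 entry above ("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...") explains why the hostpath-provisioner volumes in this log skip the device-staging step: the kubelet only calls a CSI driver's NodeStageVolume when the driver advertises the STAGE_UNSTAGE_VOLUME node capability, and otherwise completes MountDevice as an effective no-op (which is why "MountVolume.MountDevice succeeded ... device mount path ..." still appears, recording the global mount path for bookkeeping). As a hedged illustration — this is not the kubevirt.io.hostpath-provisioner source, just a minimal sketch of the relevant CSI RPC — a node plugin that omits the capability looks like this:

```go
package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
)

type nodeServer struct{}

// NodeGetCapabilities advertises which optional node RPCs this plugin
// supports. Returning no STAGE_UNSTAGE_VOLUME capability tells the kubelet
// there is no NodeStageVolume step, which is exactly what produces the
// "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..." log
// line before the kubelet proceeds straight to SetUp (NodePublishVolume).
func (ns *nodeServer) NodeGetCapabilities(ctx context.Context, _ *csi.NodeGetCapabilitiesRequest) (*csi.NodeGetCapabilitiesResponse, error) {
	return &csi.NodeGetCapabilitiesResponse{
		Capabilities: []*csi.NodeServiceCapability{
			// A staging-capable driver would instead advertise:
			// {Type: &csi.NodeServiceCapability_Rpc{Rpc: &csi.NodeServiceCapability_RPC{
			// 	Type: csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME,
			// }}},
		},
	}, nil
}

func main() {
	resp, err := (&nodeServer{}).NodeGetCapabilities(context.Background(), &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("advertised node capabilities: %d\n", len(resp.Capabilities))
}
```
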
Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.282507 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4945b6305221935745bdf4e6c1d62ba3d9fd2f307f257b88ecb8f238eb891f49/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.285776 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.290702 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49m4t\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.309783 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"] Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.312555 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775cb64d69-kj6fn"] Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.359322 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: E0929 20:05:24.454988 4780 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 29 20:05:24 crc kubenswrapper[4780]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ce7f542b-0e9c-4976-b537-20a916c01d27/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 20:05:24 crc kubenswrapper[4780]: > podSandboxID="2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b" Sep 29 20:05:24 crc kubenswrapper[4780]: E0929 20:05:24.455375 4780 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 29 20:05:24 crc kubenswrapper[4780]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:6276771339c90f342673dcaf7faa8c46e2c0ece62ed5efc4b7d65a095dabe07b,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8xph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc7bd85-rdx5g_openstack(ce7f542b-0e9c-4976-b537-20a916c01d27): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ce7f542b-0e9c-4976-b537-20a916c01d27/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 29 20:05:24 crc kubenswrapper[4780]: > logger="UnhandledError" Sep 29 20:05:24 crc kubenswrapper[4780]: E0929 20:05:24.456551 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ce7f542b-0e9c-4976-b537-20a916c01d27/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.624315 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:24 crc kubenswrapper[4780]: I0929 20:05:24.763003 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2277e48b-d2cb-4512-8a19-dc04b00a919e" path="/var/lib/kubelet/pods/2277e48b-d2cb-4512-8a19-dc04b00a919e/volumes" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.024522 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.120125 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw5ts\" (UniqueName: \"kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts\") pod \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.120175 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config\") pod \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\" (UID: \"64d5e94c-dbb8-4aa4-b416-8b84137155ff\") " Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.126529 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts" (OuterVolumeSpecName: "kube-api-access-bw5ts") pod "64d5e94c-dbb8-4aa4-b416-8b84137155ff" (UID: "64d5e94c-dbb8-4aa4-b416-8b84137155ff"). InnerVolumeSpecName "kube-api-access-bw5ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.138429 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" event={"ID":"0037fee6-408c-46d7-a95d-1028921f3f0f","Type":"ContainerStarted","Data":"5b218c9d686a5e37dc9e79a0f4d7cf1bbb8d79f5969175ddd4d313762a2b53f4"} Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.142723 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" event={"ID":"64d5e94c-dbb8-4aa4-b416-8b84137155ff","Type":"ContainerDied","Data":"45e84fe4113bb381b791b8ebef3684b4089b2ce0d94a316e66f724e0a601b23c"} Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.142752 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-87cd8867c-gl8ld" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.142796 4780 scope.go:117] "RemoveContainer" containerID="884127f6fa85ae1dc219f2b10f8b5a031bea11a815fc875ad9722f2943d66dff" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.145317 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config" (OuterVolumeSpecName: "config") pod "64d5e94c-dbb8-4aa4-b416-8b84137155ff" (UID: "64d5e94c-dbb8-4aa4-b416-8b84137155ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.164830 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" podStartSLOduration=3.164795387 podStartE2EDuration="3.164795387s" podCreationTimestamp="2025-09-29 20:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:25.155696516 +0000 UTC m=+4925.103994560" watchObservedRunningTime="2025-09-29 20:05:25.164795387 +0000 UTC m=+4925.113093431" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.212618 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:05:25 crc kubenswrapper[4780]: W0929 20:05:25.220419 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4009b891_9e80_40fb_8205_eddb545b5424.slice/crio-c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b WatchSource:0}: Error finding container c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b: Status 404 returned error can't find the container with id c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.221496 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.222370 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw5ts\" (UniqueName: \"kubernetes.io/projected/64d5e94c-dbb8-4aa4-b416-8b84137155ff-kube-api-access-bw5ts\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.222399 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d5e94c-dbb8-4aa4-b416-8b84137155ff-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:25 crc kubenswrapper[4780]: W0929 20:05:25.256397 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb526eee_bf93_413b_af6b_d851daf166e0.slice/crio-36bbfdd4477bab3ab503141494a76ef0cab8a5b33d8aeb0417a36694e21e443f WatchSource:0}: Error finding container 36bbfdd4477bab3ab503141494a76ef0cab8a5b33d8aeb0417a36694e21e443f: Status 404 returned error can't find the container with id 36bbfdd4477bab3ab503141494a76ef0cab8a5b33d8aeb0417a36694e21e443f Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.483067 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"] Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.494295 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-87cd8867c-gl8ld"] Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.636204 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 20:05:25 crc kubenswrapper[4780]: E0929 20:05:25.636624 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d5e94c-dbb8-4aa4-b416-8b84137155ff" containerName="init" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.636670 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d5e94c-dbb8-4aa4-b416-8b84137155ff" containerName="init" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.636941 4780 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64d5e94c-dbb8-4aa4-b416-8b84137155ff" containerName="init" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.638069 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.640302 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.640403 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-p5w75" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.640452 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.640769 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.642138 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.646531 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.653729 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728637 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728696 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728817 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728850 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7lm\" (UniqueName: \"kubernetes.io/projected/98e8d940-9717-43b2-9919-f12c4218b3f4-kube-api-access-kd7lm\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728905 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 
20:05:25.728978 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.728996 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.729116 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.729204 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830289 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830351 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830379 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830423 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830456 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7lm\" (UniqueName: \"kubernetes.io/projected/98e8d940-9717-43b2-9919-f12c4218b3f4-kube-api-access-kd7lm\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 
20:05:25.830473 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830506 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830521 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.830566 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.831482 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.831885 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.832004 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.832242 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98e8d940-9717-43b2-9919-f12c4218b3f4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.835638 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
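
The repeated triplets throughout this section — "operationExecutor.VerifyControllerAttachedVolume started" (reconciler_common.go:245), "operationExecutor.MountVolume started" (reconciler_common.go:218), "MountVolume.SetUp succeeded" (operation_generator.go:637), and eventually "Volume detached ... DevicePath \"\"" (reconciler_common.go:293) — are the kubelet volume manager's reconciler converging the actual state of the world toward the desired state. The following toy sketch is deliberately simplified far beyond the kubelet implementation (single-threaded, no attach phase, map-backed state) and is only meant to show the diff-and-converge shape of that loop:

```go
package main

import "fmt"

// volumes maps a volume's UniqueName to whether it is currently mounted.
type volumes map[string]bool

// reconcile is a toy version of the kubelet reconciler: mount everything the
// desired state wants but the actual state lacks, and unmount everything the
// actual state holds that no pod wants anymore.
func reconcile(desired, actual volumes) {
	for vol := range desired {
		if !actual[vol] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
			actual[vol] = true // corresponds to "MountVolume.SetUp succeeded"
		}
	}
	for vol := range actual {
		if !desired[vol] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
			delete(actual, vol) // corresponds to "Volume detached ... DevicePath \"\""
		}
	}
}

func main() {
	// Desired: volumes for the newly added rabbitmq-cell1-server-0 pod.
	desired := volumes{"config-data": true, "rabbitmq-tls": true}
	// Actual: a leftover mount from the deleted dnsmasq-dns-775cb64d69-kj6fn pod.
	actual := volumes{"kube-api-access-rsg6l": true}
	reconcile(desired, actual)
}
```

Each reconciler pass thus mounts volumes for pods that just appeared via "SyncLoop ADD" and tears down volumes for pods removed via "SyncLoop DELETE", which is the alternation visible in the surrounding entries.
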
Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.835681 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a0ec6fc5d5c7a028afd79f94dbc68b89335102e66e292b65915e64135674e2e/globalmount\"" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.836276 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.838126 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.838135 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98e8d940-9717-43b2-9919-f12c4218b3f4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.926401 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7lm\" (UniqueName: \"kubernetes.io/projected/98e8d940-9717-43b2-9919-f12c4218b3f4-kube-api-access-kd7lm\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.954795 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-03c34bcf-8496-4e39-9f87-3e1ac48bfa52\") pod \"openstack-cell1-galera-0\" (UID: \"98e8d940-9717-43b2-9919-f12c4218b3f4\") " pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:25 crc kubenswrapper[4780]: I0929 20:05:25.960921 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.010113 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.011299 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.015028 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.015244 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.015602 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.015673 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pk9z8" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.058548 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.135722 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136061 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136089 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136135 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136191 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136209 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-secrets\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136241 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136291 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589wk\" (UniqueName: \"kubernetes.io/projected/ec488a3f-cd31-4c53-817e-22c302ab7678-kube-api-access-589wk\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.136318 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.151589 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" event={"ID":"ce7f542b-0e9c-4976-b537-20a916c01d27","Type":"ContainerStarted","Data":"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d"} Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.151830 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.155178 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerStarted","Data":"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6"} Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.155222 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerStarted","Data":"36bbfdd4477bab3ab503141494a76ef0cab8a5b33d8aeb0417a36694e21e443f"} Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.159714 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerStarted","Data":"92757498b50f98edc1a9df4bd51360d9991efe26e6409aad81aa0557e0bc67cb"} Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.159781 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerStarted","Data":"c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b"} Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.162294 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.177880 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" podStartSLOduration=4.17786074 podStartE2EDuration="4.17786074s" podCreationTimestamp="2025-09-29 20:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:26.171737814 +0000 UTC m=+4926.120035868" watchObservedRunningTime="2025-09-29 20:05:26.17786074 +0000 UTC m=+4926.126158794" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.237892 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.240809 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-secrets\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.240844 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.240936 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-589wk\" (UniqueName: \"kubernetes.io/projected/ec488a3f-cd31-4c53-817e-22c302ab7678-kube-api-access-589wk\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.240961 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.241022 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.241037 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.241085 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.241154 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.238447 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.243658 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.244353 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.246476 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.246623 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5bf45e3a54f0f9f01efc59a5810944fe9cc47b8480a1cada6f4ec8fddba922e7/globalmount\"" pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.247213 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec488a3f-cd31-4c53-817e-22c302ab7678-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.257615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.262159 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.262324 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-589wk\" (UniqueName: \"kubernetes.io/projected/ec488a3f-cd31-4c53-817e-22c302ab7678-kube-api-access-589wk\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.272858 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.273385 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/ec488a3f-cd31-4c53-817e-22c302ab7678-secrets\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.295286 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ec78ab7d-75a5-4e60-9650-53cdee9213f4\") pod \"openstack-galera-0\" (UID: \"ec488a3f-cd31-4c53-817e-22c302ab7678\") " pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.335029 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.543031 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.544083 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.549027 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.549248 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.549358 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-p882v" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.602515 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.645820 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-config-data\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.646087 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.646146 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.646165 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh6b2\" (UniqueName: \"kubernetes.io/projected/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kube-api-access-zh6b2\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.646180 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kolla-config\") pod \"memcached-0\" (UID: 
\"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.747341 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.748454 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh6b2\" (UniqueName: \"kubernetes.io/projected/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kube-api-access-zh6b2\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.748547 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kolla-config\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.748670 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-config-data\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.748743 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.749516 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-config-data\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.750171 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kolla-config\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.753432 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.755811 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.765065 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d5e94c-dbb8-4aa4-b416-8b84137155ff" path="/var/lib/kubelet/pods/64d5e94c-dbb8-4aa4-b416-8b84137155ff/volumes" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 
20:05:26.765489 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh6b2\" (UniqueName: \"kubernetes.io/projected/3e5b137b-69eb-4e58-98fb-d7a4afe639c8-kube-api-access-zh6b2\") pod \"memcached-0\" (UID: \"3e5b137b-69eb-4e58-98fb-d7a4afe639c8\") " pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.907443 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 29 20:05:26 crc kubenswrapper[4780]: I0929 20:05:26.982694 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 29 20:05:27 crc kubenswrapper[4780]: I0929 20:05:27.175142 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e8d940-9717-43b2-9919-f12c4218b3f4","Type":"ContainerStarted","Data":"4c1f12f4725f14938edf412d0b19bd1ac3da5f13245b13cb519de2498776ab8e"} Sep 29 20:05:27 crc kubenswrapper[4780]: I0929 20:05:27.175479 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e8d940-9717-43b2-9919-f12c4218b3f4","Type":"ContainerStarted","Data":"1e8ade822d7c2eb583ff928f2c550fda091081f61dd02be4bd7ff2d74bcf1895"} Sep 29 20:05:27 crc kubenswrapper[4780]: I0929 20:05:27.176702 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec488a3f-cd31-4c53-817e-22c302ab7678","Type":"ContainerStarted","Data":"04d8e5a1aa6564aeef06602a024d4f77901a6d51d67aad817787617707ae0c9e"} Sep 29 20:05:27 crc kubenswrapper[4780]: I0929 20:05:27.176745 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec488a3f-cd31-4c53-817e-22c302ab7678","Type":"ContainerStarted","Data":"5b15adbc6b004b410b8c23cee9575a1c6acbd80e485ef2706f92582de783c1df"} Sep 29 20:05:27 crc kubenswrapper[4780]: I0929 20:05:27.419857 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 29 20:05:27 crc kubenswrapper[4780]: W0929 20:05:27.420361 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e5b137b_69eb_4e58_98fb_d7a4afe639c8.slice/crio-daf8df8b71c4a2b3657327a1ab715f8936573256c8b1248a20d422d29ed2c8a2 WatchSource:0}: Error finding container daf8df8b71c4a2b3657327a1ab715f8936573256c8b1248a20d422d29ed2c8a2: Status 404 returned error can't find the container with id daf8df8b71c4a2b3657327a1ab715f8936573256c8b1248a20d422d29ed2c8a2 Sep 29 20:05:28 crc kubenswrapper[4780]: I0929 20:05:28.186528 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3e5b137b-69eb-4e58-98fb-d7a4afe639c8","Type":"ContainerStarted","Data":"122a62ade388fb7a1ae7ef84d6638b3882f091462eb30bfdd32a61b2c9259377"} Sep 29 20:05:28 crc kubenswrapper[4780]: I0929 20:05:28.186599 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3e5b137b-69eb-4e58-98fb-d7a4afe639c8","Type":"ContainerStarted","Data":"daf8df8b71c4a2b3657327a1ab715f8936573256c8b1248a20d422d29ed2c8a2"} Sep 29 20:05:28 crc kubenswrapper[4780]: I0929 20:05:28.214193 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.214175296 podStartE2EDuration="2.214175296s" podCreationTimestamp="2025-09-29 20:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 20:05:28.208499073 +0000 UTC m=+4928.156797117" watchObservedRunningTime="2025-09-29 20:05:28.214175296 +0000 UTC m=+4928.162473340" Sep 29 20:05:29 crc kubenswrapper[4780]: I0929 20:05:29.198856 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 29 20:05:30 crc kubenswrapper[4780]: I0929 20:05:30.211772 4780 generic.go:334] "Generic (PLEG): container finished" podID="98e8d940-9717-43b2-9919-f12c4218b3f4" containerID="4c1f12f4725f14938edf412d0b19bd1ac3da5f13245b13cb519de2498776ab8e" exitCode=0 Sep 29 20:05:30 crc kubenswrapper[4780]: I0929 20:05:30.211847 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e8d940-9717-43b2-9919-f12c4218b3f4","Type":"ContainerDied","Data":"4c1f12f4725f14938edf412d0b19bd1ac3da5f13245b13cb519de2498776ab8e"} Sep 29 20:05:31 crc kubenswrapper[4780]: I0929 20:05:31.226376 4780 generic.go:334] "Generic (PLEG): container finished" podID="ec488a3f-cd31-4c53-817e-22c302ab7678" containerID="04d8e5a1aa6564aeef06602a024d4f77901a6d51d67aad817787617707ae0c9e" exitCode=0 Sep 29 20:05:31 crc kubenswrapper[4780]: I0929 20:05:31.226446 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec488a3f-cd31-4c53-817e-22c302ab7678","Type":"ContainerDied","Data":"04d8e5a1aa6564aeef06602a024d4f77901a6d51d67aad817787617707ae0c9e"} Sep 29 20:05:31 crc kubenswrapper[4780]: I0929 20:05:31.230672 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98e8d940-9717-43b2-9919-f12c4218b3f4","Type":"ContainerStarted","Data":"0ddf151cdb309b349e96931012b9f6733c5749d59ff83af4bbfaa6fc9a4a868c"} Sep 29 20:05:31 crc kubenswrapper[4780]: I0929 20:05:31.293429 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.29341073 podStartE2EDuration="7.29341073s" podCreationTimestamp="2025-09-29 20:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:31.292083862 +0000 UTC m=+4931.240381916" watchObservedRunningTime="2025-09-29 20:05:31.29341073 +0000 UTC m=+4931.241708774" Sep 29 20:05:32 crc kubenswrapper[4780]: I0929 20:05:32.246139 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec488a3f-cd31-4c53-817e-22c302ab7678","Type":"ContainerStarted","Data":"542f8da6b74f6fa8a5af61d8b6b68707218c602f5fc123fcdbaa10457fc49c9a"} Sep 29 20:05:32 crc kubenswrapper[4780]: I0929 20:05:32.282499 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.282473145 podStartE2EDuration="8.282473145s" podCreationTimestamp="2025-09-29 20:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:32.279400637 +0000 UTC m=+4932.227698721" watchObservedRunningTime="2025-09-29 20:05:32.282473145 +0000 UTC m=+4932.230771219" Sep 29 20:05:32 crc kubenswrapper[4780]: I0929 20:05:32.801366 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:33 crc kubenswrapper[4780]: I0929 20:05:33.204184 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 
20:05:33 crc kubenswrapper[4780]: I0929 20:05:33.223446 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:05:33 crc kubenswrapper[4780]: I0929 20:05:33.223529 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:05:33 crc kubenswrapper[4780]: I0929 20:05:33.274813 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:33 crc kubenswrapper[4780]: I0929 20:05:33.275238 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="dnsmasq-dns" containerID="cri-o://06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d" gracePeriod=10 Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.198513 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.266648 4780 generic.go:334] "Generic (PLEG): container finished" podID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerID="06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d" exitCode=0 Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.266698 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" event={"ID":"ce7f542b-0e9c-4976-b537-20a916c01d27","Type":"ContainerDied","Data":"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d"} Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.266728 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" event={"ID":"ce7f542b-0e9c-4976-b537-20a916c01d27","Type":"ContainerDied","Data":"2451ad2b6e3154a4c769d298477ddb78f5326a46010026418839952a131f826b"} Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.266748 4780 scope.go:117] "RemoveContainer" containerID="06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.266903 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7bd85-rdx5g" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.289848 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc\") pod \"ce7f542b-0e9c-4976-b537-20a916c01d27\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.290947 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config\") pod \"ce7f542b-0e9c-4976-b537-20a916c01d27\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.291080 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xph\" (UniqueName: \"kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph\") pod \"ce7f542b-0e9c-4976-b537-20a916c01d27\" (UID: \"ce7f542b-0e9c-4976-b537-20a916c01d27\") " Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.309349 4780 scope.go:117] "RemoveContainer" containerID="41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.309995 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph" (OuterVolumeSpecName: "kube-api-access-p8xph") pod "ce7f542b-0e9c-4976-b537-20a916c01d27" (UID: "ce7f542b-0e9c-4976-b537-20a916c01d27"). InnerVolumeSpecName "kube-api-access-p8xph". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.369237 4780 scope.go:117] "RemoveContainer" containerID="06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d" Sep 29 20:05:34 crc kubenswrapper[4780]: E0929 20:05:34.369712 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d\": container with ID starting with 06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d not found: ID does not exist" containerID="06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.369827 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d"} err="failed to get container status \"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d\": rpc error: code = NotFound desc = could not find container \"06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d\": container with ID starting with 06be5bf07cd6122e70fb21b5f8226f4e31a1e1d9515dd2aea2158c7fc55a645d not found: ID does not exist" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.369897 4780 scope.go:117] "RemoveContainer" containerID="41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c" Sep 29 20:05:34 crc kubenswrapper[4780]: E0929 20:05:34.370207 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c\": container with ID starting with 41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c not found: ID does not 
exist" containerID="41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.370277 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c"} err="failed to get container status \"41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c\": rpc error: code = NotFound desc = could not find container \"41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c\": container with ID starting with 41410c7b0072764d67a33199a4b334405ed23217d89d6f77665aab26791ad23c not found: ID does not exist" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.382247 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce7f542b-0e9c-4976-b537-20a916c01d27" (UID: "ce7f542b-0e9c-4976-b537-20a916c01d27"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.392467 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.392630 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8xph\" (UniqueName: \"kubernetes.io/projected/ce7f542b-0e9c-4976-b537-20a916c01d27-kube-api-access-p8xph\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.406778 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config" (OuterVolumeSpecName: "config") pod "ce7f542b-0e9c-4976-b537-20a916c01d27" (UID: "ce7f542b-0e9c-4976-b537-20a916c01d27"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.493506 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7f542b-0e9c-4976-b537-20a916c01d27-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.601635 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.612548 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc7bd85-rdx5g"] Sep 29 20:05:34 crc kubenswrapper[4780]: I0929 20:05:34.762747 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" path="/var/lib/kubelet/pods/ce7f542b-0e9c-4976-b537-20a916c01d27/volumes" Sep 29 20:05:35 crc kubenswrapper[4780]: I0929 20:05:35.961587 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:35 crc kubenswrapper[4780]: I0929 20:05:35.961902 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:36 crc kubenswrapper[4780]: I0929 20:05:36.338297 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 29 20:05:36 crc kubenswrapper[4780]: I0929 20:05:36.338397 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 29 20:05:36 crc kubenswrapper[4780]: I0929 20:05:36.909193 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 29 20:05:38 crc kubenswrapper[4780]: I0929 20:05:38.045450 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:38 crc kubenswrapper[4780]: I0929 20:05:38.097591 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 29 20:05:38 crc kubenswrapper[4780]: I0929 20:05:38.422609 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 29 20:05:38 crc kubenswrapper[4780]: I0929 20:05:38.486496 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 29 20:05:56 crc kubenswrapper[4780]: I0929 20:05:56.466942 4780 generic.go:334] "Generic (PLEG): container finished" podID="bb526eee-bf93-413b-af6b-d851daf166e0" containerID="10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6" exitCode=0 Sep 29 20:05:56 crc kubenswrapper[4780]: I0929 20:05:56.467029 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerDied","Data":"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6"} Sep 29 20:05:56 crc kubenswrapper[4780]: I0929 20:05:56.470492 4780 generic.go:334] "Generic (PLEG): container finished" podID="4009b891-9e80-40fb-8205-eddb545b5424" containerID="92757498b50f98edc1a9df4bd51360d9991efe26e6409aad81aa0557e0bc67cb" exitCode=0 Sep 29 20:05:56 crc kubenswrapper[4780]: I0929 20:05:56.470538 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerDied","Data":"92757498b50f98edc1a9df4bd51360d9991efe26e6409aad81aa0557e0bc67cb"} Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.480990 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerStarted","Data":"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370"} Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.481775 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.483710 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerStarted","Data":"6856f336587debb5533c741a442609522bdbc048e7eb5bd28b4078f0298d8be2"} Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.483914 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.506537 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.506506914 podStartE2EDuration="35.506506914s" podCreationTimestamp="2025-09-29 20:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:57.503312402 +0000 UTC m=+4957.451610456" watchObservedRunningTime="2025-09-29 20:05:57.506506914 +0000 UTC m=+4957.454804978" Sep 29 20:05:57 crc kubenswrapper[4780]: I0929 20:05:57.542389 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.542368621 podStartE2EDuration="35.542368621s" podCreationTimestamp="2025-09-29 20:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:05:57.539288753 +0000 UTC m=+4957.487586807" watchObservedRunningTime="2025-09-29 20:05:57.542368621 +0000 UTC m=+4957.490666685" Sep 29 20:06:03 crc kubenswrapper[4780]: I0929 20:06:03.223989 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:06:03 crc kubenswrapper[4780]: I0929 20:06:03.225130 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:06:13 crc kubenswrapper[4780]: I0929 20:06:13.929215 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 29 20:06:14 crc kubenswrapper[4780]: I0929 20:06:14.628316 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:17 crc kubenswrapper[4780]: E0929 20:06:17.249530 4780 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:60134->38.102.83.80:37067: write tcp 
38.102.83.80:60134->38.102.83.80:37067: write: broken pipe Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.137629 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:06:23 crc kubenswrapper[4780]: E0929 20:06:23.138733 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="dnsmasq-dns" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.138752 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="dnsmasq-dns" Sep 29 20:06:23 crc kubenswrapper[4780]: E0929 20:06:23.138767 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="init" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.138775 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="init" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.138957 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7f542b-0e9c-4976-b537-20a916c01d27" containerName="dnsmasq-dns" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.140202 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.164159 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.273637 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdvf\" (UniqueName: \"kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.273693 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.273829 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.375203 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.375471 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjdvf\" (UniqueName: \"kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc 
kubenswrapper[4780]: I0929 20:06:23.375565 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.376199 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.377023 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.406940 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdvf\" (UniqueName: \"kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf\") pod \"dnsmasq-dns-6885566dd9-t8smt\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.462263 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.835006 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:06:23 crc kubenswrapper[4780]: I0929 20:06:23.958741 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:06:24 crc kubenswrapper[4780]: W0929 20:06:24.033130 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc925558e_f918_4b3c_be41_38f20faeefea.slice/crio-15aa768be84ef30b9a7cb9367176e0fed1d7b6540bba3eef00629d400a41f323 WatchSource:0}: Error finding container 15aa768be84ef30b9a7cb9367176e0fed1d7b6540bba3eef00629d400a41f323: Status 404 returned error can't find the container with id 15aa768be84ef30b9a7cb9367176e0fed1d7b6540bba3eef00629d400a41f323 Sep 29 20:06:24 crc kubenswrapper[4780]: I0929 20:06:24.599784 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:06:24 crc kubenswrapper[4780]: I0929 20:06:24.744795 4780 generic.go:334] "Generic (PLEG): container finished" podID="c925558e-f918-4b3c-be41-38f20faeefea" containerID="8783a37f331e28ea26f34b89916d013f5d27ab71e128f5f7e9fa1b3ef37158eb" exitCode=0 Sep 29 20:06:24 crc kubenswrapper[4780]: I0929 20:06:24.744846 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" event={"ID":"c925558e-f918-4b3c-be41-38f20faeefea","Type":"ContainerDied","Data":"8783a37f331e28ea26f34b89916d013f5d27ab71e128f5f7e9fa1b3ef37158eb"} Sep 29 20:06:24 crc kubenswrapper[4780]: I0929 20:06:24.744876 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" 
event={"ID":"c925558e-f918-4b3c-be41-38f20faeefea","Type":"ContainerStarted","Data":"15aa768be84ef30b9a7cb9367176e0fed1d7b6540bba3eef00629d400a41f323"} Sep 29 20:06:25 crc kubenswrapper[4780]: I0929 20:06:25.754197 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" event={"ID":"c925558e-f918-4b3c-be41-38f20faeefea","Type":"ContainerStarted","Data":"a63c54af105e15fbbff097c399930b624f738987943240c6f830e15b3a8baa36"} Sep 29 20:06:25 crc kubenswrapper[4780]: I0929 20:06:25.754497 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:25 crc kubenswrapper[4780]: I0929 20:06:25.781355 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" podStartSLOduration=2.7813391210000002 podStartE2EDuration="2.781339121s" podCreationTimestamp="2025-09-29 20:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:06:25.774253108 +0000 UTC m=+4985.722551152" watchObservedRunningTime="2025-09-29 20:06:25.781339121 +0000 UTC m=+4985.729637165" Sep 29 20:06:28 crc kubenswrapper[4780]: I0929 20:06:28.185316 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="rabbitmq" containerID="cri-o://6856f336587debb5533c741a442609522bdbc048e7eb5bd28b4078f0298d8be2" gracePeriod=604796 Sep 29 20:06:29 crc kubenswrapper[4780]: I0929 20:06:29.094847 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="rabbitmq" containerID="cri-o://2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370" gracePeriod=604796 Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.223371 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.223707 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.223764 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.224594 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.224744 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" 
podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" gracePeriod=600 Sep 29 20:06:33 crc kubenswrapper[4780]: E0929 20:06:33.460285 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.464380 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.547655 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.548816 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="dnsmasq-dns" containerID="cri-o://5b218c9d686a5e37dc9e79a0f4d7cf1bbb8d79f5969175ddd4d313762a2b53f4" gracePeriod=10 Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.848358 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" exitCode=0 Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.848413 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"} Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.848466 4780 scope.go:117] "RemoveContainer" containerID="dfb132c59b3079e843ab5ef45babfca3451e0db3c6b14052465d4ce66e9d1239" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.849346 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:06:33 crc kubenswrapper[4780]: E0929 20:06:33.849736 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.854463 4780 generic.go:334] "Generic (PLEG): container finished" podID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerID="5b218c9d686a5e37dc9e79a0f4d7cf1bbb8d79f5969175ddd4d313762a2b53f4" exitCode=0 Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.854506 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" event={"ID":"0037fee6-408c-46d7-a95d-1028921f3f0f","Type":"ContainerDied","Data":"5b218c9d686a5e37dc9e79a0f4d7cf1bbb8d79f5969175ddd4d313762a2b53f4"} Sep 29 20:06:33 crc kubenswrapper[4780]: I0929 20:06:33.926886 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.241:5671: connect: connection refused" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.054693 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.163915 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzxs5\" (UniqueName: \"kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5\") pod \"0037fee6-408c-46d7-a95d-1028921f3f0f\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.164036 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config\") pod \"0037fee6-408c-46d7-a95d-1028921f3f0f\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.164282 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc\") pod \"0037fee6-408c-46d7-a95d-1028921f3f0f\" (UID: \"0037fee6-408c-46d7-a95d-1028921f3f0f\") " Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.172217 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5" (OuterVolumeSpecName: "kube-api-access-kzxs5") pod "0037fee6-408c-46d7-a95d-1028921f3f0f" (UID: "0037fee6-408c-46d7-a95d-1028921f3f0f"). InnerVolumeSpecName "kube-api-access-kzxs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.201127 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0037fee6-408c-46d7-a95d-1028921f3f0f" (UID: "0037fee6-408c-46d7-a95d-1028921f3f0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.222397 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config" (OuterVolumeSpecName: "config") pod "0037fee6-408c-46d7-a95d-1028921f3f0f" (UID: "0037fee6-408c-46d7-a95d-1028921f3f0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.266891 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.266938 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzxs5\" (UniqueName: \"kubernetes.io/projected/0037fee6-408c-46d7-a95d-1028921f3f0f-kube-api-access-kzxs5\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.266972 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0037fee6-408c-46d7-a95d-1028921f3f0f-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.625777 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5671: connect: connection refused" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.867428 4780 generic.go:334] "Generic (PLEG): container finished" podID="4009b891-9e80-40fb-8205-eddb545b5424" containerID="6856f336587debb5533c741a442609522bdbc048e7eb5bd28b4078f0298d8be2" exitCode=0 Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.867557 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerDied","Data":"6856f336587debb5533c741a442609522bdbc048e7eb5bd28b4078f0298d8be2"} Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.867628 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4009b891-9e80-40fb-8205-eddb545b5424","Type":"ContainerDied","Data":"c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b"} Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.867652 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c13d7ba4f7b1f57b7281c725bd339adecfa9d320b8451003c6cc14539de7ea9b" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.870836 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" event={"ID":"0037fee6-408c-46d7-a95d-1028921f3f0f","Type":"ContainerDied","Data":"58868b19a17e7333f55d77f925738ce3023efa72d97ba39720085a6d6260ae85"} Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.870884 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f455d6d69-q98wd" Sep 29 20:06:34 crc kubenswrapper[4780]: I0929 20:06:34.870901 4780 scope.go:117] "RemoveContainer" containerID="5b218c9d686a5e37dc9e79a0f4d7cf1bbb8d79f5969175ddd4d313762a2b53f4" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.419181 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.444532 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.452824 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f455d6d69-q98wd"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.460858 4780 scope.go:117] "RemoveContainer" containerID="dc1247fb0bd8bdfa59323230295d5f5c7b1da19eceb7ea65df9c6ebf7f874939" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592409 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4009b891-9e80-40fb-8205-eddb545b5424-pod-info\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592574 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592642 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592677 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-plugins\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592887 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592926 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.592945 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.593021 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9vw9\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.593297 4780 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.593326 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.593328 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.593404 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls\") pod \"4009b891-9e80-40fb-8205-eddb545b5424\" (UID: \"4009b891-9e80-40fb-8205-eddb545b5424\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.595672 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.595703 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.595734 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.598066 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.598291 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.599690 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9" (OuterVolumeSpecName: "kube-api-access-z9vw9") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "kube-api-access-z9vw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.607348 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3" (OuterVolumeSpecName: "persistence") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.611650 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4009b891-9e80-40fb-8205-eddb545b5424-pod-info" (OuterVolumeSpecName: "pod-info") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.628183 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data" (OuterVolumeSpecName: "config-data") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.668973 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf" (OuterVolumeSpecName: "server-conf") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.685637 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4009b891-9e80-40fb-8205-eddb545b5424" (UID: "4009b891-9e80-40fb-8205-eddb545b5424"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697460 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697487 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697499 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9vw9\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-kube-api-access-z9vw9\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697509 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697519 4780 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697530 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4009b891-9e80-40fb-8205-eddb545b5424-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697541 4780 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4009b891-9e80-40fb-8205-eddb545b5424-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697552 4780 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4009b891-9e80-40fb-8205-eddb545b5424-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697560 4780 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4009b891-9e80-40fb-8205-eddb545b5424-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.697595 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") on node \"crc\" " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.714993 4780 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.715160 4780 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3") on node "crc" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.719116 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798457 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798505 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49m4t\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798560 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798641 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798749 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798786 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798833 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798860 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798888 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798929 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins\") pod 
\"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.798957 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls\") pod \"bb526eee-bf93-413b-af6b-d851daf166e0\" (UID: \"bb526eee-bf93-413b-af6b-d851daf166e0\") " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.799286 4780 reconciler_common.go:293] "Volume detached for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.800339 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.800900 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.801152 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.825508 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.826345 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.826813 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d" (OuterVolumeSpecName: "persistence") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "pvc-2b6ee439-1111-4e9b-af28-32310041724d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.826925 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t" (OuterVolumeSpecName: "kube-api-access-49m4t") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "kube-api-access-49m4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.827201 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info" (OuterVolumeSpecName: "pod-info") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.827436 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data" (OuterVolumeSpecName: "config-data") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.858121 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf" (OuterVolumeSpecName: "server-conf") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.891531 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bb526eee-bf93-413b-af6b-d851daf166e0" (UID: "bb526eee-bf93-413b-af6b-d851daf166e0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900871 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900914 4780 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") on node \"crc\" " Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900926 4780 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-server-conf\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900940 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900952 4780 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb526eee-bf93-413b-af6b-d851daf166e0-pod-info\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900961 4780 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900972 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900980 4780 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900989 4780 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb526eee-bf93-413b-af6b-d851daf166e0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.900998 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49m4t\" (UniqueName: \"kubernetes.io/projected/bb526eee-bf93-413b-af6b-d851daf166e0-kube-api-access-49m4t\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.901107 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb526eee-bf93-413b-af6b-d851daf166e0-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.904655 4780 generic.go:334] "Generic (PLEG): container finished" podID="bb526eee-bf93-413b-af6b-d851daf166e0" containerID="2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370" exitCode=0 Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.904823 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerDied","Data":"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370"} Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.904912 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb526eee-bf93-413b-af6b-d851daf166e0","Type":"ContainerDied","Data":"36bbfdd4477bab3ab503141494a76ef0cab8a5b33d8aeb0417a36694e21e443f"} Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.904854 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.905017 4780 scope.go:117] "RemoveContainer" containerID="2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.906653 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.918823 4780 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.918960 4780 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2b6ee439-1111-4e9b-af28-32310041724d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d") on node "crc" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.929004 4780 scope.go:117] "RemoveContainer" containerID="10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.950456 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.956004 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.962473 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.970769 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.994829 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995696 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="init" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995724 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="init" Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995743 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="setup-container" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995750 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="setup-container" Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995770 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="setup-container" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995777 4780 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="setup-container" Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995788 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="dnsmasq-dns" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995794 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="dnsmasq-dns" Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995807 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995813 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: E0929 20:06:35.995826 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.995831 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.996020 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" containerName="dnsmasq-dns" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.996054 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4009b891-9e80-40fb-8205-eddb545b5424" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.996065 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" containerName="rabbitmq" Sep 29 20:06:35 crc kubenswrapper[4780]: I0929 20:06:35.997980 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.001900 4780 scope.go:117] "RemoveContainer" containerID="2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002270 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002350 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002381 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002551 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002757 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g5jlk" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.002984 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.003273 4780 reconciler_common.go:293] "Volume detached for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") on node \"crc\" DevicePath \"\"" Sep 29 20:06:36 crc kubenswrapper[4780]: E0929 20:06:36.004083 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370\": container with ID starting with 2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370 not found: ID does not exist" containerID="2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.004124 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370"} err="failed to get container status \"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370\": rpc error: code = NotFound desc = could not find container \"2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370\": container with ID starting with 2d451acb5f182ee57e709ceb4d8d86117f5808344de6fcb91b5b8f809b2c4370 not found: ID does not exist" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.004154 4780 scope.go:117] "RemoveContainer" containerID="10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6" Sep 29 20:06:36 crc kubenswrapper[4780]: E0929 20:06:36.004664 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6\": container with ID starting with 10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6 not found: ID does not exist" containerID="10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.004903 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6"} err="failed to get container status 
\"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6\": rpc error: code = NotFound desc = could not find container \"10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6\": container with ID starting with 10144419d3cfa06be98ac719c08f65291a809bcd1e41634d6d403527913fecd6 not found: ID does not exist" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.006621 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.008248 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.013919 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.014035 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.021964 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.022523 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.024315 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.024649 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.025171 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.025371 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.025522 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tnlhl" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.027504 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.104835 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.104888 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.104920 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 
crc kubenswrapper[4780]: I0929 20:06:36.105016 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105156 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105226 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105310 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105363 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105423 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105460 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105508 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105576 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc 
kubenswrapper[4780]: I0929 20:06:36.105680 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105725 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105756 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105807 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105856 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmghw\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-kube-api-access-fmghw\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105882 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105904 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlxn2\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-kube-api-access-zlxn2\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.105988 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.106012 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.106107 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208179 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmghw\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-kube-api-access-fmghw\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208258 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208282 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlxn2\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-kube-api-access-zlxn2\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208364 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208392 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208434 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208480 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208512 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208546 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208584 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208620 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208652 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208687 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208716 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208770 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208800 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208832 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208872 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208910 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208933 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.208964 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.209428 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.209909 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.210734 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.210940 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.211217 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.211599 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.211759 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.212635 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.213011 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.213493 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.214968 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.215001 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e285308d095dbc190ec4eda8d58fb5c556ca750eaac2fbd6c0192ddeec8db59/globalmount\"" pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.215538 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.215718 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4945b6305221935745bdf4e6c1d62ba3d9fd2f307f257b88ecb8f238eb891f49/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.216237 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.217605 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.217893 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.219092 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.219553 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.219700 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.220491 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.230485 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 
20:06:36.233088 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlxn2\" (UniqueName: \"kubernetes.io/projected/82d758e0-ddd1-4c96-bfa9-bd81f14359ac-kube-api-access-zlxn2\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.233961 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmghw\" (UniqueName: \"kubernetes.io/projected/dd09e83d-ac55-42fb-9d0c-b84c5d12c284-kube-api-access-fmghw\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.270573 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2b6ee439-1111-4e9b-af28-32310041724d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2b6ee439-1111-4e9b-af28-32310041724d\") pod \"rabbitmq-cell1-server-0\" (UID: \"dd09e83d-ac55-42fb-9d0c-b84c5d12c284\") " pod="openstack/rabbitmq-cell1-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.273606 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d5e3cba-1090-42b3-9716-76aa57a3f1d3\") pod \"rabbitmq-server-0\" (UID: \"82d758e0-ddd1-4c96-bfa9-bd81f14359ac\") " pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.340833 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.349868 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.657568 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.723871 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 29 20:06:36 crc kubenswrapper[4780]: W0929 20:06:36.727843 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d758e0_ddd1_4c96_bfa9_bd81f14359ac.slice/crio-ee0683c25730773e30ba1cfb2523fc1754cefe8d27683b48ad25e2bfee4af97a WatchSource:0}: Error finding container ee0683c25730773e30ba1cfb2523fc1754cefe8d27683b48ad25e2bfee4af97a: Status 404 returned error can't find the container with id ee0683c25730773e30ba1cfb2523fc1754cefe8d27683b48ad25e2bfee4af97a
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.763959 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0037fee6-408c-46d7-a95d-1028921f3f0f" path="/var/lib/kubelet/pods/0037fee6-408c-46d7-a95d-1028921f3f0f/volumes"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.764977 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4009b891-9e80-40fb-8205-eddb545b5424" path="/var/lib/kubelet/pods/4009b891-9e80-40fb-8205-eddb545b5424/volumes"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.766195 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb526eee-bf93-413b-af6b-d851daf166e0" path="/var/lib/kubelet/pods/bb526eee-bf93-413b-af6b-d851daf166e0/volumes"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.817912 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.820683 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.824630 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.917133 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dd09e83d-ac55-42fb-9d0c-b84c5d12c284","Type":"ContainerStarted","Data":"d9a4ff377abdbc5f526f4ebc2b8607b224ea2b9bd0b7d15b8a18aa9537b75be7"}
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.919366 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82d758e0-ddd1-4c96-bfa9-bd81f14359ac","Type":"ContainerStarted","Data":"ee0683c25730773e30ba1cfb2523fc1754cefe8d27683b48ad25e2bfee4af97a"}
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.928975 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.929121 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrj6\" (UniqueName: \"kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:36 crc kubenswrapper[4780]: I0929 20:06:36.929274 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.030666 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.030763 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.030804 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrj6\" (UniqueName: \"kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.031641 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.031704 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.052193 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrj6\" (UniqueName: \"kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6\") pod \"redhat-marketplace-b2d4j\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") " pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.224574 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.714082 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.932095 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dd09e83d-ac55-42fb-9d0c-b84c5d12c284","Type":"ContainerStarted","Data":"a117706119c241b88b55854b330a8120c2e59d02cd0146afa26f142dd222fa63"}
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.938401 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82d758e0-ddd1-4c96-bfa9-bd81f14359ac","Type":"ContainerStarted","Data":"be8f33541e34e94ac21b625277ec2d3dfe0b5d60e10a6b630496aeecad148a0a"}
Sep 29 20:06:37 crc kubenswrapper[4780]: I0929 20:06:37.940994 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerStarted","Data":"3340176b010da30720793df22888a38fe0f96af52d27f7702bf1f97934860cb1"}
Sep 29 20:06:38 crc kubenswrapper[4780]: I0929 20:06:38.954154 4780 generic.go:334] "Generic (PLEG): container finished" podID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerID="2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247" exitCode=0
Sep 29 20:06:38 crc kubenswrapper[4780]: I0929 20:06:38.954562 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerDied","Data":"2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247"}
Sep 29 20:06:40 crc kubenswrapper[4780]: I0929 20:06:40.976556 4780 generic.go:334] "Generic (PLEG): container finished" podID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerID="db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51" exitCode=0
Sep 29 20:06:40 crc kubenswrapper[4780]: I0929 20:06:40.977573 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerDied","Data":"db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51"}
Sep 29 20:06:41 crc kubenswrapper[4780]: I0929 20:06:41.992083 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerStarted","Data":"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"}
Sep 29 20:06:42 crc kubenswrapper[4780]: I0929 20:06:42.026481 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b2d4j" podStartSLOduration=3.567743219 podStartE2EDuration="6.026454063s" podCreationTimestamp="2025-09-29 20:06:36 +0000 UTC" firstStartedPulling="2025-09-29 20:06:38.956956868 +0000 UTC m=+4998.905254942" lastFinishedPulling="2025-09-29 20:06:41.415667702 +0000 UTC m=+5001.363965786" observedRunningTime="2025-09-29 20:06:42.024658892 +0000 UTC m=+5001.972956946" watchObservedRunningTime="2025-09-29 20:06:42.026454063 +0000 UTC m=+5001.974752117"
Sep 29 20:06:47 crc kubenswrapper[4780]: I0929 20:06:47.225841 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:47 crc kubenswrapper[4780]: I0929 20:06:47.226670 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:47 crc kubenswrapper[4780]: I0929 20:06:47.308834 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:48 crc kubenswrapper[4780]: I0929 20:06:48.491267 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:48 crc kubenswrapper[4780]: I0929 20:06:48.573341 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:48 crc kubenswrapper[4780]: I0929 20:06:48.754134 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:06:48 crc kubenswrapper[4780]: E0929 20:06:48.754831 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.084319 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b2d4j" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="registry-server" containerID="cri-o://a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc" gracePeriod=2
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.558166 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.708260 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrj6\" (UniqueName: \"kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6\") pod \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") "
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.708476 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content\") pod \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") "
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.708550 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities\") pod \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\" (UID: \"fab06f91-ae72-4c71-b69f-f951de4c0ed3\") "
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.710121 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities" (OuterVolumeSpecName: "utilities") pod "fab06f91-ae72-4c71-b69f-f951de4c0ed3" (UID: "fab06f91-ae72-4c71-b69f-f951de4c0ed3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.718038 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6" (OuterVolumeSpecName: "kube-api-access-cnrj6") pod "fab06f91-ae72-4c71-b69f-f951de4c0ed3" (UID: "fab06f91-ae72-4c71-b69f-f951de4c0ed3"). InnerVolumeSpecName "kube-api-access-cnrj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.734303 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fab06f91-ae72-4c71-b69f-f951de4c0ed3" (UID: "fab06f91-ae72-4c71-b69f-f951de4c0ed3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.811202 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrj6\" (UniqueName: \"kubernetes.io/projected/fab06f91-ae72-4c71-b69f-f951de4c0ed3-kube-api-access-cnrj6\") on node \"crc\" DevicePath \"\""
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.811269 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 29 20:06:50 crc kubenswrapper[4780]: I0929 20:06:50.811301 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fab06f91-ae72-4c71-b69f-f951de4c0ed3-utilities\") on node \"crc\" DevicePath \"\""
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.099785 4780 generic.go:334] "Generic (PLEG): container finished" podID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerID="a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc" exitCode=0
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.099838 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerDied","Data":"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"}
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.099888 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b2d4j"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.100005 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b2d4j" event={"ID":"fab06f91-ae72-4c71-b69f-f951de4c0ed3","Type":"ContainerDied","Data":"3340176b010da30720793df22888a38fe0f96af52d27f7702bf1f97934860cb1"}
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.100040 4780 scope.go:117] "RemoveContainer" containerID="a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.141379 4780 scope.go:117] "RemoveContainer" containerID="db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.156432 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.164487 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b2d4j"]
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.196628 4780 scope.go:117] "RemoveContainer" containerID="2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.231205 4780 scope.go:117] "RemoveContainer" containerID="a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"
Sep 29 20:06:51 crc kubenswrapper[4780]: E0929 20:06:51.231813 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc\": container with ID starting with a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc not found: ID does not exist" containerID="a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.231863 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc"} err="failed to get container status \"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc\": rpc error: code = NotFound desc = could not find container \"a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc\": container with ID starting with a3ccc536d3d978baaa8ca75f46597403cbc4f0c51f500ba545478b887f05ebbc not found: ID does not exist"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.231894 4780 scope.go:117] "RemoveContainer" containerID="db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51"
Sep 29 20:06:51 crc kubenswrapper[4780]: E0929 20:06:51.232731 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51\": container with ID starting with db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51 not found: ID does not exist" containerID="db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.232836 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51"} err="failed to get container status \"db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51\": rpc error: code = NotFound desc = could not find container \"db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51\": container with ID starting with db7223bdc1516a6105f750ca58a469bd972e3e4615d978379313f72b8f0efb51 not found: ID does not exist"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.232884 4780 scope.go:117] "RemoveContainer" containerID="2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247"
Sep 29 20:06:51 crc kubenswrapper[4780]: E0929 20:06:51.233504 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247\": container with ID starting with 2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247 not found: ID does not exist" containerID="2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247"
Sep 29 20:06:51 crc kubenswrapper[4780]: I0929 20:06:51.233545 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247"} err="failed to get container status \"2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247\": rpc error: code = NotFound desc = could not find container \"2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247\": container with ID starting with 2210413a2b0819ac7f29bc953d5e90d4075ffb5fda1054cdb1b1ecd98d3db247 not found: ID does not exist"
Sep 29 20:06:52 crc kubenswrapper[4780]: I0929 20:06:52.772027 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" path="/var/lib/kubelet/pods/fab06f91-ae72-4c71-b69f-f951de4c0ed3/volumes"
Sep 29 20:07:01 crc kubenswrapper[4780]: I0929 20:07:01.753659 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:07:01 crc kubenswrapper[4780]: E0929 20:07:01.754873 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:07:08 crc kubenswrapper[4780]: I0929 20:07:08.273761 4780 generic.go:334] "Generic (PLEG): container finished" podID="82d758e0-ddd1-4c96-bfa9-bd81f14359ac" containerID="be8f33541e34e94ac21b625277ec2d3dfe0b5d60e10a6b630496aeecad148a0a" exitCode=0
Sep 29 20:07:08 crc kubenswrapper[4780]: I0929 20:07:08.273831 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82d758e0-ddd1-4c96-bfa9-bd81f14359ac","Type":"ContainerDied","Data":"be8f33541e34e94ac21b625277ec2d3dfe0b5d60e10a6b630496aeecad148a0a"}
Sep 29 20:07:08 crc kubenswrapper[4780]: I0929 20:07:08.278212 4780 generic.go:334] "Generic (PLEG): container finished" podID="dd09e83d-ac55-42fb-9d0c-b84c5d12c284" containerID="a117706119c241b88b55854b330a8120c2e59d02cd0146afa26f142dd222fa63" exitCode=0
Sep 29 20:07:08 crc kubenswrapper[4780]: I0929 20:07:08.278335 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dd09e83d-ac55-42fb-9d0c-b84c5d12c284","Type":"ContainerDied","Data":"a117706119c241b88b55854b330a8120c2e59d02cd0146afa26f142dd222fa63"}
Sep 29 20:07:09 crc kubenswrapper[4780]: I0929 20:07:09.302445 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dd09e83d-ac55-42fb-9d0c-b84c5d12c284","Type":"ContainerStarted","Data":"1e57c23011620131e8f34dd941df349cff79623eab39e1e5bfc9b5a140608841"}
Sep 29 20:07:09 crc kubenswrapper[4780]: I0929 20:07:09.304482 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Sep 29 20:07:09 crc kubenswrapper[4780]: I0929 20:07:09.307811 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82d758e0-ddd1-4c96-bfa9-bd81f14359ac","Type":"ContainerStarted","Data":"82f2642e5f0a6f3c367537753cd317897ae9baa502f7d84b7a982723de5d5ee4"}
Sep 29 20:07:09 crc kubenswrapper[4780]: I0929 20:07:09.308310 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 29 20:07:09 crc kubenswrapper[4780]: I0929 20:07:09.337731 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.337710297 podStartE2EDuration="34.337710297s" podCreationTimestamp="2025-09-29 20:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:07:09.332167908 +0000 UTC m=+5029.280465962" watchObservedRunningTime="2025-09-29 20:07:09.337710297 +0000 UTC m=+5029.286008341"
Sep 29 20:07:12 crc kubenswrapper[4780]: I0929 20:07:12.753148 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:07:12 crc kubenswrapper[4780]: E0929 20:07:12.753628 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:07:26 crc kubenswrapper[4780]: I0929 20:07:26.346488 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 29 20:07:26 crc kubenswrapper[4780]: I0929 20:07:26.353380 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Sep 29 20:07:26 crc kubenswrapper[4780]: I0929 20:07:26.402646 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.402617551 podStartE2EDuration="51.402617551s" podCreationTimestamp="2025-09-29 20:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:07:09.365636786 +0000 UTC m=+5029.313934840" watchObservedRunningTime="2025-09-29 20:07:26.402617551 +0000 UTC m=+5046.350915645"
Sep 29 20:07:27 crc kubenswrapper[4780]: I0929 20:07:27.753671 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:07:27 crc kubenswrapper[4780]: E0929 20:07:27.754471 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.427538 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"]
Sep 29 20:07:29 crc kubenswrapper[4780]: E0929 20:07:29.428305 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="extract-content"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.428325 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="extract-content"
Sep 29 20:07:29 crc kubenswrapper[4780]: E0929 20:07:29.428353 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="registry-server"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.428361 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="registry-server"
Sep 29 20:07:29 crc kubenswrapper[4780]: E0929 20:07:29.428381 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="extract-utilities"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.428390 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="extract-utilities"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.428634 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab06f91-ae72-4c71-b69f-f951de4c0ed3" containerName="registry-server"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.429416 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.432691 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5lsnv"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.455784 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.500856 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcj5w\" (UniqueName: \"kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w\") pod \"mariadb-client-1-default\" (UID: \"9569740b-0c88-4a12-be4b-fb5f8b988fe8\") " pod="openstack/mariadb-client-1-default"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.602149 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcj5w\" (UniqueName: \"kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w\") pod \"mariadb-client-1-default\" (UID: \"9569740b-0c88-4a12-be4b-fb5f8b988fe8\") " pod="openstack/mariadb-client-1-default"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.624960 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcj5w\" (UniqueName: \"kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w\") pod \"mariadb-client-1-default\" (UID: \"9569740b-0c88-4a12-be4b-fb5f8b988fe8\") " pod="openstack/mariadb-client-1-default"
Sep 29 20:07:29 crc kubenswrapper[4780]: I0929 20:07:29.758434 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Sep 29 20:07:30 crc kubenswrapper[4780]: I0929 20:07:30.163986 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Sep 29 20:07:30 crc kubenswrapper[4780]: W0929 20:07:30.171141 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9569740b_0c88_4a12_be4b_fb5f8b988fe8.slice/crio-d94ac0aaea72b4d0f344004a14c64cb1ecf118b91b2813cc279711727ec55dea WatchSource:0}: Error finding container d94ac0aaea72b4d0f344004a14c64cb1ecf118b91b2813cc279711727ec55dea: Status 404 returned error can't find the container with id d94ac0aaea72b4d0f344004a14c64cb1ecf118b91b2813cc279711727ec55dea
Sep 29 20:07:30 crc kubenswrapper[4780]: I0929 20:07:30.522235 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"9569740b-0c88-4a12-be4b-fb5f8b988fe8","Type":"ContainerStarted","Data":"d94ac0aaea72b4d0f344004a14c64cb1ecf118b91b2813cc279711727ec55dea"}
Sep 29 20:07:31 crc kubenswrapper[4780]: I0929 20:07:31.536195 4780 generic.go:334] "Generic (PLEG): container finished" podID="9569740b-0c88-4a12-be4b-fb5f8b988fe8" containerID="7eabb7045a672a44c7c8dd1326facae09d8047f2d4932a74bb92829e3c67f6fe" exitCode=0
Sep 29 20:07:31 crc kubenswrapper[4780]: I0929 20:07:31.536303 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"9569740b-0c88-4a12-be4b-fb5f8b988fe8","Type":"ContainerDied","Data":"7eabb7045a672a44c7c8dd1326facae09d8047f2d4932a74bb92829e3c67f6fe"}
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.010807 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.049558 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_9569740b-0c88-4a12-be4b-fb5f8b988fe8/mariadb-client-1-default/0.log"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.070885 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcj5w\" (UniqueName: \"kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w\") pod \"9569740b-0c88-4a12-be4b-fb5f8b988fe8\" (UID: \"9569740b-0c88-4a12-be4b-fb5f8b988fe8\") "
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.083250 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"]
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.084654 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w" (OuterVolumeSpecName: "kube-api-access-qcj5w") pod "9569740b-0c88-4a12-be4b-fb5f8b988fe8" (UID: "9569740b-0c88-4a12-be4b-fb5f8b988fe8"). InnerVolumeSpecName "kube-api-access-qcj5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.088915 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"]
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.172221 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcj5w\" (UniqueName: \"kubernetes.io/projected/9569740b-0c88-4a12-be4b-fb5f8b988fe8-kube-api-access-qcj5w\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.488200 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"]
Sep 29 20:07:33 crc kubenswrapper[4780]: E0929 20:07:33.488924 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9569740b-0c88-4a12-be4b-fb5f8b988fe8" containerName="mariadb-client-1-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.488948 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9569740b-0c88-4a12-be4b-fb5f8b988fe8" containerName="mariadb-client-1-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.489199 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="9569740b-0c88-4a12-be4b-fb5f8b988fe8" containerName="mariadb-client-1-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.489796 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.507429 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.560432 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d94ac0aaea72b4d0f344004a14c64cb1ecf118b91b2813cc279711727ec55dea"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.560682 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.579231 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnb2c\" (UniqueName: \"kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c\") pod \"mariadb-client-2-default\" (UID: \"a39aef75-4ea6-4200-ab41-eb7931f38eca\") " pod="openstack/mariadb-client-2-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.680533 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnb2c\" (UniqueName: \"kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c\") pod \"mariadb-client-2-default\" (UID: \"a39aef75-4ea6-4200-ab41-eb7931f38eca\") " pod="openstack/mariadb-client-2-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.704904 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnb2c\" (UniqueName: \"kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c\") pod \"mariadb-client-2-default\" (UID: \"a39aef75-4ea6-4200-ab41-eb7931f38eca\") " pod="openstack/mariadb-client-2-default"
Sep 29 20:07:33 crc kubenswrapper[4780]: I0929 20:07:33.822466 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Sep 29 20:07:34 crc kubenswrapper[4780]: I0929 20:07:34.436940 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Sep 29 20:07:34 crc kubenswrapper[4780]: W0929 20:07:34.444343 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39aef75_4ea6_4200_ab41_eb7931f38eca.slice/crio-af4ed2fed32abb2ad70e93c0ea3524fc1b943ea627f9db8bbc71b08ea99c1022 WatchSource:0}: Error finding container af4ed2fed32abb2ad70e93c0ea3524fc1b943ea627f9db8bbc71b08ea99c1022: Status 404 returned error can't find the container with id af4ed2fed32abb2ad70e93c0ea3524fc1b943ea627f9db8bbc71b08ea99c1022
Sep 29 20:07:34 crc kubenswrapper[4780]: I0929 20:07:34.572369 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"a39aef75-4ea6-4200-ab41-eb7931f38eca","Type":"ContainerStarted","Data":"af4ed2fed32abb2ad70e93c0ea3524fc1b943ea627f9db8bbc71b08ea99c1022"}
Sep 29 20:07:34 crc kubenswrapper[4780]: I0929 20:07:34.769001 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9569740b-0c88-4a12-be4b-fb5f8b988fe8" path="/var/lib/kubelet/pods/9569740b-0c88-4a12-be4b-fb5f8b988fe8/volumes"
Sep 29 20:07:35 crc kubenswrapper[4780]: I0929 20:07:35.588135 4780 generic.go:334] "Generic (PLEG): container finished" podID="a39aef75-4ea6-4200-ab41-eb7931f38eca" containerID="6aa52cc784f9990a21bf658a4fdb458f00aec85dcc0246dc86e67b23cce2e165" exitCode=0
Sep 29 20:07:35 crc kubenswrapper[4780]: I0929 20:07:35.588333 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"a39aef75-4ea6-4200-ab41-eb7931f38eca","Type":"ContainerDied","Data":"6aa52cc784f9990a21bf658a4fdb458f00aec85dcc0246dc86e67b23cce2e165"}
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.072892 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.138896 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnb2c\" (UniqueName: \"kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c\") pod \"a39aef75-4ea6-4200-ab41-eb7931f38eca\" (UID: \"a39aef75-4ea6-4200-ab41-eb7931f38eca\") "
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.147349 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c" (OuterVolumeSpecName: "kube-api-access-jnb2c") pod "a39aef75-4ea6-4200-ab41-eb7931f38eca" (UID: "a39aef75-4ea6-4200-ab41-eb7931f38eca"). InnerVolumeSpecName "kube-api-access-jnb2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.147374 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_a39aef75-4ea6-4200-ab41-eb7931f38eca/mariadb-client-2-default/0.log"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.175571 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"]
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.183525 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"]
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.241188 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnb2c\" (UniqueName: \"kubernetes.io/projected/a39aef75-4ea6-4200-ab41-eb7931f38eca-kube-api-access-jnb2c\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.587282 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"]
Sep 29 20:07:37 crc kubenswrapper[4780]: E0929 20:07:37.587784 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39aef75-4ea6-4200-ab41-eb7931f38eca" containerName="mariadb-client-2-default"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.587797 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39aef75-4ea6-4200-ab41-eb7931f38eca" containerName="mariadb-client-2-default"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.587946 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39aef75-4ea6-4200-ab41-eb7931f38eca" containerName="mariadb-client-2-default"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.588401 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.608212 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.624548 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af4ed2fed32abb2ad70e93c0ea3524fc1b943ea627f9db8bbc71b08ea99c1022"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.624646 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.646638 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv6v9\" (UniqueName: \"kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9\") pod \"mariadb-client-1\" (UID: \"1a54bf7d-2a17-4532-a08b-0911a9ada98c\") " pod="openstack/mariadb-client-1"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.748676 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv6v9\" (UniqueName: \"kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9\") pod \"mariadb-client-1\" (UID: \"1a54bf7d-2a17-4532-a08b-0911a9ada98c\") " pod="openstack/mariadb-client-1"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.783630 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv6v9\" (UniqueName: \"kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9\") pod \"mariadb-client-1\" (UID: \"1a54bf7d-2a17-4532-a08b-0911a9ada98c\") " pod="openstack/mariadb-client-1"
Sep 29 20:07:37 crc kubenswrapper[4780]: I0929 20:07:37.911914 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Sep 29 20:07:38 crc kubenswrapper[4780]: I0929 20:07:38.292727 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Sep 29 20:07:38 crc kubenswrapper[4780]: I0929 20:07:38.635693 4780 generic.go:334] "Generic (PLEG): container finished" podID="1a54bf7d-2a17-4532-a08b-0911a9ada98c" containerID="55b24169fd6f112ae7bc335f57093ec95562a5188a6c8834c7ab3de95933b579" exitCode=0
Sep 29 20:07:38 crc kubenswrapper[4780]: I0929 20:07:38.635762 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1a54bf7d-2a17-4532-a08b-0911a9ada98c","Type":"ContainerDied","Data":"55b24169fd6f112ae7bc335f57093ec95562a5188a6c8834c7ab3de95933b579"}
Sep 29 20:07:38 crc kubenswrapper[4780]: I0929 20:07:38.635847 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1a54bf7d-2a17-4532-a08b-0911a9ada98c","Type":"ContainerStarted","Data":"34d746bb21b18af40348609b82baa942b2ba4e7dcaac5191927d82e818c5a42e"}
Sep 29 20:07:38 crc kubenswrapper[4780]: I0929 20:07:38.770816 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39aef75-4ea6-4200-ab41-eb7931f38eca" path="/var/lib/kubelet/pods/a39aef75-4ea6-4200-ab41-eb7931f38eca/volumes"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.091218 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.112174 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_1a54bf7d-2a17-4532-a08b-0911a9ada98c/mariadb-client-1/0.log"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.141928 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"]
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.155542 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"]
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.185635 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv6v9\" (UniqueName: \"kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9\") pod \"1a54bf7d-2a17-4532-a08b-0911a9ada98c\" (UID: \"1a54bf7d-2a17-4532-a08b-0911a9ada98c\") "
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.191810 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9" (OuterVolumeSpecName: "kube-api-access-lv6v9") pod "1a54bf7d-2a17-4532-a08b-0911a9ada98c" (UID: "1a54bf7d-2a17-4532-a08b-0911a9ada98c"). InnerVolumeSpecName "kube-api-access-lv6v9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.288012 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv6v9\" (UniqueName: \"kubernetes.io/projected/1a54bf7d-2a17-4532-a08b-0911a9ada98c-kube-api-access-lv6v9\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.549509 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"]
Sep 29 20:07:40 crc kubenswrapper[4780]: E0929 20:07:40.552367 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a54bf7d-2a17-4532-a08b-0911a9ada98c" containerName="mariadb-client-1"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.552420 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a54bf7d-2a17-4532-a08b-0911a9ada98c" containerName="mariadb-client-1"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.553156 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a54bf7d-2a17-4532-a08b-0911a9ada98c" containerName="mariadb-client-1"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.554604 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.562703 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.593284 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkbh\" (UniqueName: \"kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh\") pod \"mariadb-client-4-default\" (UID: \"14b942dc-a85f-4e4c-9e38-332aa725ba96\") " pod="openstack/mariadb-client-4-default"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.658979 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34d746bb21b18af40348609b82baa942b2ba4e7dcaac5191927d82e818c5a42e"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.659177 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.694093 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkbh\" (UniqueName: \"kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh\") pod \"mariadb-client-4-default\" (UID: \"14b942dc-a85f-4e4c-9e38-332aa725ba96\") " pod="openstack/mariadb-client-4-default"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.715145 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkbh\" (UniqueName: \"kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh\") pod \"mariadb-client-4-default\" (UID: \"14b942dc-a85f-4e4c-9e38-332aa725ba96\") " pod="openstack/mariadb-client-4-default"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.761346 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:07:40 crc kubenswrapper[4780]: E0929 20:07:40.761573 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.765247 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a54bf7d-2a17-4532-a08b-0911a9ada98c" path="/var/lib/kubelet/pods/1a54bf7d-2a17-4532-a08b-0911a9ada98c/volumes"
Sep 29 20:07:40 crc kubenswrapper[4780]: I0929 20:07:40.890626 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Sep 29 20:07:41 crc kubenswrapper[4780]: I0929 20:07:41.569286 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Sep 29 20:07:41 crc kubenswrapper[4780]: W0929 20:07:41.581387 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b942dc_a85f_4e4c_9e38_332aa725ba96.slice/crio-7dbbf5fb16c8c6688b16cdfe8d20f770f40c24ed1dc6cc710d2c5ea086072e63 WatchSource:0}: Error finding container 7dbbf5fb16c8c6688b16cdfe8d20f770f40c24ed1dc6cc710d2c5ea086072e63: Status 404 returned error can't find the container with id 7dbbf5fb16c8c6688b16cdfe8d20f770f40c24ed1dc6cc710d2c5ea086072e63
Sep 29 20:07:41 crc kubenswrapper[4780]: I0929 20:07:41.669646 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"14b942dc-a85f-4e4c-9e38-332aa725ba96","Type":"ContainerStarted","Data":"7dbbf5fb16c8c6688b16cdfe8d20f770f40c24ed1dc6cc710d2c5ea086072e63"}
Sep 29 20:07:42 crc kubenswrapper[4780]: I0929 20:07:42.682298 4780 generic.go:334] "Generic (PLEG): container finished" podID="14b942dc-a85f-4e4c-9e38-332aa725ba96" containerID="eca1d1effa6b407393527a27880255d5f98319d0367241d8bdecd7924be6bc13" exitCode=0
Sep 29 20:07:42 crc kubenswrapper[4780]: I0929 20:07:42.682387 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"14b942dc-a85f-4e4c-9e38-332aa725ba96","Type":"ContainerDied","Data":"eca1d1effa6b407393527a27880255d5f98319d0367241d8bdecd7924be6bc13"}
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.173934 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.193251 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_14b942dc-a85f-4e4c-9e38-332aa725ba96/mariadb-client-4-default/0.log"
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.222916 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"]
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.229035 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"]
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.256121 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxkbh\" (UniqueName: \"kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh\") pod \"14b942dc-a85f-4e4c-9e38-332aa725ba96\" (UID: \"14b942dc-a85f-4e4c-9e38-332aa725ba96\") "
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.266531 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh" (OuterVolumeSpecName: "kube-api-access-pxkbh") pod "14b942dc-a85f-4e4c-9e38-332aa725ba96" (UID: "14b942dc-a85f-4e4c-9e38-332aa725ba96"). InnerVolumeSpecName "kube-api-access-pxkbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.357592 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxkbh\" (UniqueName: \"kubernetes.io/projected/14b942dc-a85f-4e4c-9e38-332aa725ba96-kube-api-access-pxkbh\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.706439 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dbbf5fb16c8c6688b16cdfe8d20f770f40c24ed1dc6cc710d2c5ea086072e63"
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.706556 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Sep 29 20:07:44 crc kubenswrapper[4780]: I0929 20:07:44.774147 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b942dc-a85f-4e4c-9e38-332aa725ba96" path="/var/lib/kubelet/pods/14b942dc-a85f-4e4c-9e38-332aa725ba96/volumes"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.132323 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"]
Sep 29 20:07:48 crc kubenswrapper[4780]: E0929 20:07:48.133763 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b942dc-a85f-4e4c-9e38-332aa725ba96" containerName="mariadb-client-4-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.133784 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b942dc-a85f-4e4c-9e38-332aa725ba96" containerName="mariadb-client-4-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.134011 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b942dc-a85f-4e4c-9e38-332aa725ba96" containerName="mariadb-client-4-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.134617 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.138706 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5lsnv"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.144512 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.230852 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8mfv\" (UniqueName: \"kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv\") pod \"mariadb-client-5-default\" (UID: \"4064a95e-63a2-40f8-8e95-fd1131c95d1a\") " pod="openstack/mariadb-client-5-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.332464 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8mfv\" (UniqueName: \"kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv\") pod \"mariadb-client-5-default\" (UID: \"4064a95e-63a2-40f8-8e95-fd1131c95d1a\") " pod="openstack/mariadb-client-5-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.369332 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8mfv\" (UniqueName: \"kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv\") pod \"mariadb-client-5-default\" (UID: \"4064a95e-63a2-40f8-8e95-fd1131c95d1a\") " pod="openstack/mariadb-client-5-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.462550 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Sep 29 20:07:48 crc kubenswrapper[4780]: I0929 20:07:48.869453 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Sep 29 20:07:49 crc kubenswrapper[4780]: I0929 20:07:49.756019 4780 generic.go:334] "Generic (PLEG): container finished" podID="4064a95e-63a2-40f8-8e95-fd1131c95d1a" containerID="781f7b6b7335269720839f04ca793aaef6cdc37db4aae5835f8da5977964224e" exitCode=0
Sep 29 20:07:49 crc kubenswrapper[4780]: I0929 20:07:49.756090 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4064a95e-63a2-40f8-8e95-fd1131c95d1a","Type":"ContainerDied","Data":"781f7b6b7335269720839f04ca793aaef6cdc37db4aae5835f8da5977964224e"}
Sep 29 20:07:49 crc kubenswrapper[4780]: I0929 20:07:49.756122 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4064a95e-63a2-40f8-8e95-fd1131c95d1a","Type":"ContainerStarted","Data":"5de18a827373e737e4e53d0afb0af003d005329c387d3efe05f37c297ec3eb13"}
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.290851 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.313008 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_4064a95e-63a2-40f8-8e95-fd1131c95d1a/mariadb-client-5-default/0.log"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.345271 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"]
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.352472 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"]
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.393018 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8mfv\" (UniqueName: \"kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv\") pod \"4064a95e-63a2-40f8-8e95-fd1131c95d1a\" (UID: \"4064a95e-63a2-40f8-8e95-fd1131c95d1a\") "
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.401087 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv" (OuterVolumeSpecName: "kube-api-access-z8mfv") pod "4064a95e-63a2-40f8-8e95-fd1131c95d1a" (UID: "4064a95e-63a2-40f8-8e95-fd1131c95d1a"). InnerVolumeSpecName "kube-api-access-z8mfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.495310 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8mfv\" (UniqueName: \"kubernetes.io/projected/4064a95e-63a2-40f8-8e95-fd1131c95d1a-kube-api-access-z8mfv\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.504193 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"]
Sep 29 20:07:51 crc kubenswrapper[4780]: E0929 20:07:51.504657 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4064a95e-63a2-40f8-8e95-fd1131c95d1a" containerName="mariadb-client-5-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.504691 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="4064a95e-63a2-40f8-8e95-fd1131c95d1a" containerName="mariadb-client-5-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.504917 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="4064a95e-63a2-40f8-8e95-fd1131c95d1a" containerName="mariadb-client-5-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.505568 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.513552 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.597216 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq8b\" (UniqueName: \"kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b\") pod \"mariadb-client-6-default\" (UID: \"76a02c19-52ef-4cbd-a128-9bbbe9d55e68\") " pod="openstack/mariadb-client-6-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.698915 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq8b\" (UniqueName: \"kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b\") pod \"mariadb-client-6-default\" (UID: \"76a02c19-52ef-4cbd-a128-9bbbe9d55e68\") " pod="openstack/mariadb-client-6-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.725851 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq8b\" (UniqueName: \"kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b\") pod \"mariadb-client-6-default\" (UID: \"76a02c19-52ef-4cbd-a128-9bbbe9d55e68\") " pod="openstack/mariadb-client-6-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.778898 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5de18a827373e737e4e53d0afb0af003d005329c387d3efe05f37c297ec3eb13"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.778980 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Sep 29 20:07:51 crc kubenswrapper[4780]: I0929 20:07:51.832660 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.186728 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Sep 29 20:07:52 crc kubenswrapper[4780]: W0929 20:07:52.190609 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76a02c19_52ef_4cbd_a128_9bbbe9d55e68.slice/crio-1bed000708d0ca2a50b089f0ccd23d61b99c04ed75047b9cb0e71b491f649c4d WatchSource:0}: Error finding container 1bed000708d0ca2a50b089f0ccd23d61b99c04ed75047b9cb0e71b491f649c4d: Status 404 returned error can't find the container with id 1bed000708d0ca2a50b089f0ccd23d61b99c04ed75047b9cb0e71b491f649c4d
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.765455 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4064a95e-63a2-40f8-8e95-fd1131c95d1a" path="/var/lib/kubelet/pods/4064a95e-63a2-40f8-8e95-fd1131c95d1a/volumes"
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.791438 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"76a02c19-52ef-4cbd-a128-9bbbe9d55e68","Type":"ContainerStarted","Data":"9310dbff183335c1f7278fa86f7cc8bf66b096f6eb2ff0028846b2903ef2d08a"}
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.791499 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"76a02c19-52ef-4cbd-a128-9bbbe9d55e68","Type":"ContainerStarted","Data":"1bed000708d0ca2a50b089f0ccd23d61b99c04ed75047b9cb0e71b491f649c4d"}
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.808866 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.808834403 podStartE2EDuration="1.808834403s" podCreationTimestamp="2025-09-29 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:07:52.808260426 +0000 UTC m=+5072.756558480" watchObservedRunningTime="2025-09-29 20:07:52.808834403 +0000 UTC m=+5072.757132487"
Sep 29 20:07:52 crc kubenswrapper[4780]: I0929 20:07:52.900989 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_76a02c19-52ef-4cbd-a128-9bbbe9d55e68/mariadb-client-6-default/0.log"
Sep 29 20:07:53 crc kubenswrapper[4780]: I0929 20:07:53.803147 4780 generic.go:334] "Generic (PLEG): container finished" podID="76a02c19-52ef-4cbd-a128-9bbbe9d55e68" containerID="9310dbff183335c1f7278fa86f7cc8bf66b096f6eb2ff0028846b2903ef2d08a" exitCode=0
Sep 29 20:07:53 crc kubenswrapper[4780]: I0929 20:07:53.803240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"76a02c19-52ef-4cbd-a128-9bbbe9d55e68","Type":"ContainerDied","Data":"9310dbff183335c1f7278fa86f7cc8bf66b096f6eb2ff0028846b2903ef2d08a"}
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.306951 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.377716 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"]
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.387820 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"]
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.474587 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xq8b\" (UniqueName: \"kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b\") pod \"76a02c19-52ef-4cbd-a128-9bbbe9d55e68\" (UID: \"76a02c19-52ef-4cbd-a128-9bbbe9d55e68\") "
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.482491 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b" (OuterVolumeSpecName: "kube-api-access-8xq8b") pod "76a02c19-52ef-4cbd-a128-9bbbe9d55e68" (UID: "76a02c19-52ef-4cbd-a128-9bbbe9d55e68"). InnerVolumeSpecName "kube-api-access-8xq8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.521394 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"]
Sep 29 20:07:55 crc kubenswrapper[4780]: E0929 20:07:55.521985 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a02c19-52ef-4cbd-a128-9bbbe9d55e68" containerName="mariadb-client-6-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.522031 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a02c19-52ef-4cbd-a128-9bbbe9d55e68" containerName="mariadb-client-6-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.522346 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a02c19-52ef-4cbd-a128-9bbbe9d55e68" containerName="mariadb-client-6-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.523351 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.526458 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.576478 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xq8b\" (UniqueName: \"kubernetes.io/projected/76a02c19-52ef-4cbd-a128-9bbbe9d55e68-kube-api-access-8xq8b\") on node \"crc\" DevicePath \"\""
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.678341 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7fz7\" (UniqueName: \"kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7\") pod \"mariadb-client-7-default\" (UID: \"5e6d5106-1ba3-49b8-aec3-53ccbb065422\") " pod="openstack/mariadb-client-7-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.753767 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af"
Sep 29 20:07:55 crc kubenswrapper[4780]: E0929 20:07:55.754211 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.779996 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7fz7\" (UniqueName: \"kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7\") pod \"mariadb-client-7-default\" (UID: \"5e6d5106-1ba3-49b8-aec3-53ccbb065422\") " pod="openstack/mariadb-client-7-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.810162 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7fz7\" (UniqueName: \"kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7\") pod \"mariadb-client-7-default\" (UID: \"5e6d5106-1ba3-49b8-aec3-53ccbb065422\") " pod="openstack/mariadb-client-7-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.829889 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bed000708d0ca2a50b089f0ccd23d61b99c04ed75047b9cb0e71b491f649c4d"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.830000 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Sep 29 20:07:55 crc kubenswrapper[4780]: I0929 20:07:55.861361 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 29 20:07:56 crc kubenswrapper[4780]: I0929 20:07:56.214912 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 29 20:07:56 crc kubenswrapper[4780]: W0929 20:07:56.219762 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e6d5106_1ba3_49b8_aec3_53ccbb065422.slice/crio-b7912fea198785ed66bce7438690b9d3fa31e1343f9861aa6a9045d8ab6ef20c WatchSource:0}: Error finding container b7912fea198785ed66bce7438690b9d3fa31e1343f9861aa6a9045d8ab6ef20c: Status 404 returned error can't find the container with id b7912fea198785ed66bce7438690b9d3fa31e1343f9861aa6a9045d8ab6ef20c Sep 29 20:07:56 crc kubenswrapper[4780]: I0929 20:07:56.788765 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a02c19-52ef-4cbd-a128-9bbbe9d55e68" path="/var/lib/kubelet/pods/76a02c19-52ef-4cbd-a128-9bbbe9d55e68/volumes" Sep 29 20:07:56 crc kubenswrapper[4780]: I0929 20:07:56.841428 4780 generic.go:334] "Generic (PLEG): container finished" podID="5e6d5106-1ba3-49b8-aec3-53ccbb065422" containerID="e18bb61ee1665354a5498fc724f754ee4e0fc915ed426f3bd3de58ae0808a663" exitCode=0 Sep 29 20:07:56 crc kubenswrapper[4780]: I0929 20:07:56.841471 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5e6d5106-1ba3-49b8-aec3-53ccbb065422","Type":"ContainerDied","Data":"e18bb61ee1665354a5498fc724f754ee4e0fc915ed426f3bd3de58ae0808a663"} Sep 29 20:07:56 crc kubenswrapper[4780]: I0929 20:07:56.841497 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5e6d5106-1ba3-49b8-aec3-53ccbb065422","Type":"ContainerStarted","Data":"b7912fea198785ed66bce7438690b9d3fa31e1343f9861aa6a9045d8ab6ef20c"} Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.242527 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.259844 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_5e6d5106-1ba3-49b8-aec3-53ccbb065422/mariadb-client-7-default/0.log" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.283206 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.288546 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.344792 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7fz7\" (UniqueName: \"kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7\") pod \"5e6d5106-1ba3-49b8-aec3-53ccbb065422\" (UID: \"5e6d5106-1ba3-49b8-aec3-53ccbb065422\") " Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.353716 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7" (OuterVolumeSpecName: "kube-api-access-l7fz7") pod "5e6d5106-1ba3-49b8-aec3-53ccbb065422" (UID: "5e6d5106-1ba3-49b8-aec3-53ccbb065422"). InnerVolumeSpecName "kube-api-access-l7fz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.425938 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Sep 29 20:07:58 crc kubenswrapper[4780]: E0929 20:07:58.426397 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6d5106-1ba3-49b8-aec3-53ccbb065422" containerName="mariadb-client-7-default" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.426414 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6d5106-1ba3-49b8-aec3-53ccbb065422" containerName="mariadb-client-7-default" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.426630 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6d5106-1ba3-49b8-aec3-53ccbb065422" containerName="mariadb-client-7-default" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.427477 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.445695 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.445988 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmj7\" (UniqueName: \"kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7\") pod \"mariadb-client-2\" (UID: \"c3710015-a813-47e7-a670-11fcc28c1901\") " pod="openstack/mariadb-client-2" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.447119 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7fz7\" (UniqueName: \"kubernetes.io/projected/5e6d5106-1ba3-49b8-aec3-53ccbb065422-kube-api-access-l7fz7\") on node \"crc\" DevicePath \"\"" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.547564 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmj7\" (UniqueName: \"kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7\") pod \"mariadb-client-2\" (UID: \"c3710015-a813-47e7-a670-11fcc28c1901\") " pod="openstack/mariadb-client-2" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.577832 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmj7\" (UniqueName: \"kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7\") pod \"mariadb-client-2\" (UID: \"c3710015-a813-47e7-a670-11fcc28c1901\") " pod="openstack/mariadb-client-2" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.753424 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.774993 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6d5106-1ba3-49b8-aec3-53ccbb065422" path="/var/lib/kubelet/pods/5e6d5106-1ba3-49b8-aec3-53ccbb065422/volumes" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.880404 4780 scope.go:117] "RemoveContainer" containerID="e18bb61ee1665354a5498fc724f754ee4e0fc915ed426f3bd3de58ae0808a663" Sep 29 20:07:58 crc kubenswrapper[4780]: I0929 20:07:58.880596 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Sep 29 20:07:59 crc kubenswrapper[4780]: I0929 20:07:59.357195 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Sep 29 20:07:59 crc kubenswrapper[4780]: I0929 20:07:59.894500 4780 generic.go:334] "Generic (PLEG): container finished" podID="c3710015-a813-47e7-a670-11fcc28c1901" containerID="db8e9ca3dcc446f1d34ff93b944814248b5d82fb10c7588cc1a295c01bb674a7" exitCode=0 Sep 29 20:07:59 crc kubenswrapper[4780]: I0929 20:07:59.894595 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"c3710015-a813-47e7-a670-11fcc28c1901","Type":"ContainerDied","Data":"db8e9ca3dcc446f1d34ff93b944814248b5d82fb10c7588cc1a295c01bb674a7"} Sep 29 20:07:59 crc kubenswrapper[4780]: I0929 20:07:59.894879 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"c3710015-a813-47e7-a670-11fcc28c1901","Type":"ContainerStarted","Data":"47d1bdbe87cca7c5725a3ca69ad78ec2aaa416eeb762fa9a78206859ffb03700"} Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.409216 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.427986 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_c3710015-a813-47e7-a670-11fcc28c1901/mariadb-client-2/0.log" Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.450902 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.455981 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.598402 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kmj7\" (UniqueName: \"kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7\") pod \"c3710015-a813-47e7-a670-11fcc28c1901\" (UID: \"c3710015-a813-47e7-a670-11fcc28c1901\") " Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.606464 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7" (OuterVolumeSpecName: "kube-api-access-8kmj7") pod "c3710015-a813-47e7-a670-11fcc28c1901" (UID: "c3710015-a813-47e7-a670-11fcc28c1901"). InnerVolumeSpecName "kube-api-access-8kmj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.700916 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kmj7\" (UniqueName: \"kubernetes.io/projected/c3710015-a813-47e7-a670-11fcc28c1901-kube-api-access-8kmj7\") on node \"crc\" DevicePath \"\"" Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.916586 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47d1bdbe87cca7c5725a3ca69ad78ec2aaa416eeb762fa9a78206859ffb03700" Sep 29 20:08:01 crc kubenswrapper[4780]: I0929 20:08:01.916698 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Sep 29 20:08:02 crc kubenswrapper[4780]: I0929 20:08:02.772125 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3710015-a813-47e7-a670-11fcc28c1901" path="/var/lib/kubelet/pods/c3710015-a813-47e7-a670-11fcc28c1901/volumes" Sep 29 20:08:06 crc kubenswrapper[4780]: I0929 20:08:06.753456 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:08:06 crc kubenswrapper[4780]: E0929 20:08:06.754327 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:08:20 crc kubenswrapper[4780]: I0929 20:08:20.764137 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:08:20 crc kubenswrapper[4780]: E0929 20:08:20.765238 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:08:33 crc kubenswrapper[4780]: I0929 20:08:33.753179 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:08:33 crc kubenswrapper[4780]: E0929 20:08:33.754631 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:08:44 crc kubenswrapper[4780]: I0929 20:08:44.753521 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:08:44 crc kubenswrapper[4780]: E0929 20:08:44.754564 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:08:56 crc kubenswrapper[4780]: I0929 20:08:56.754464 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:08:56 crc kubenswrapper[4780]: E0929 20:08:56.755843 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:09:10 crc kubenswrapper[4780]: I0929 20:09:10.761615 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:09:10 crc kubenswrapper[4780]: E0929 20:09:10.762943 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:09:25 crc kubenswrapper[4780]: I0929 20:09:25.752838 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:09:25 crc kubenswrapper[4780]: E0929 20:09:25.753723 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:09:27 crc kubenswrapper[4780]: I0929 20:09:27.499085 4780 scope.go:117] "RemoveContainer" containerID="20e896b97f0a24815659c6a08319ddca461008a924b321237ffb98cf131324cc" Sep 29 20:09:40 crc kubenswrapper[4780]: I0929 20:09:40.763919 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:09:40 crc kubenswrapper[4780]: E0929 20:09:40.765502 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:09:51 crc kubenswrapper[4780]: I0929 20:09:51.753700 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:09:51 crc kubenswrapper[4780]: E0929 20:09:51.754974 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:10:06 crc kubenswrapper[4780]: I0929 20:10:06.752808 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:10:06 crc kubenswrapper[4780]: E0929 20:10:06.753766 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:10:21 crc kubenswrapper[4780]: I0929 20:10:21.753354 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:10:21 crc kubenswrapper[4780]: E0929 20:10:21.754505 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:10:33 crc kubenswrapper[4780]: I0929 20:10:33.753146 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:10:33 crc kubenswrapper[4780]: E0929 20:10:33.754158 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:10:46 crc kubenswrapper[4780]: I0929 20:10:46.753614 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:10:46 crc kubenswrapper[4780]: E0929 20:10:46.754310 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.925304 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Sep 29 20:10:58 crc kubenswrapper[4780]: E0929 20:10:58.926705 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3710015-a813-47e7-a670-11fcc28c1901" containerName="mariadb-client-2" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.926738 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3710015-a813-47e7-a670-11fcc28c1901" containerName="mariadb-client-2" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.927154 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3710015-a813-47e7-a670-11fcc28c1901" containerName="mariadb-client-2" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.928276 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.931820 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5lsnv" Sep 29 20:10:58 crc kubenswrapper[4780]: I0929 20:10:58.935292 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.049725 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6t79\" (UniqueName: \"kubernetes.io/projected/d2556c4e-903c-4377-97eb-0eb017939756-kube-api-access-m6t79\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.049991 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f73857f3-e778-46ba-af9a-c996b42207cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73857f3-e778-46ba-af9a-c996b42207cb\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.151883 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6t79\" (UniqueName: \"kubernetes.io/projected/d2556c4e-903c-4377-97eb-0eb017939756-kube-api-access-m6t79\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.151967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f73857f3-e778-46ba-af9a-c996b42207cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73857f3-e778-46ba-af9a-c996b42207cb\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.157673 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
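The mount sequence above for the openstack/mariadb-copy-data pod runs operationExecutor.VerifyControllerAttachedVolume, MountVolume.MountDevice, and MountVolume.SetUp in order; MountDevice is skipped here because the kubevirt.io.hostpath-provisioner CSI driver does not advertise the STAGE_UNSTAGE_VOLUME capability, and the next entry shows the kubelet recording MountDevice as succeeded without any staging work. The same binding can be cross-checked from the API side by resolving the volume handle pvc-f73857f3-e778-46ba-af9a-c996b42207cb back to its PersistentVolume and bound claim. A minimal sketch with the Python kubernetes client, assuming kubeconfig access to this cluster; the PV name and namespace are taken verbatim from the log, everything else is illustrative:

    # Resolve the PV named in the kubelet log back to its CSI driver and
    # bound claim. Assumption: a kubeconfig for this CRC cluster is
    # reachable; the PV name is copied from the log entries above.
    from kubernetes import client, config

    config.load_kube_config()
    core = client.CoreV1Api()

    pv_name = "pvc-f73857f3-e778-46ba-af9a-c996b42207cb"
    pv = core.read_persistent_volume(pv_name)
    print("CSI driver:  ", pv.spec.csi.driver)   # expect kubevirt.io.hostpath-provisioner
    print("volumeHandle:", pv.spec.csi.volume_handle)

    # The log shows only the volume handle, not the claim name, so find
    # the PVC in the openstack namespace that is bound to this PV.
    for pvc in core.list_namespaced_persistent_volume_claim("openstack").items:
        if pvc.spec.volume_name == pv_name:
            print("bound claim: ", pvc.metadata.name, pvc.status.phase)

Because the driver never stages the volume on the node, the globalmount path the kubelet records is bookkeeping only; the real work happens in the per-pod SetUp step that follows.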
Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.157763 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f73857f3-e778-46ba-af9a-c996b42207cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73857f3-e778-46ba-af9a-c996b42207cb\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2da0d5d0e82ca352f1998f965b76e5ffa1b584b8d94e207c745559d15b1db65d/globalmount\"" pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.179454 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6t79\" (UniqueName: \"kubernetes.io/projected/d2556c4e-903c-4377-97eb-0eb017939756-kube-api-access-m6t79\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.196741 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f73857f3-e778-46ba-af9a-c996b42207cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f73857f3-e778-46ba-af9a-c996b42207cb\") pod \"mariadb-copy-data\" (UID: \"d2556c4e-903c-4377-97eb-0eb017939756\") " pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.275767 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.678855 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Sep 29 20:10:59 crc kubenswrapper[4780]: I0929 20:10:59.761233 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d2556c4e-903c-4377-97eb-0eb017939756","Type":"ContainerStarted","Data":"99528c3533a124f549de9ea8fd5238cbe2c6effe86f5b270c1441e60e84975dd"} Sep 29 20:11:00 crc kubenswrapper[4780]: I0929 20:11:00.784943 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d2556c4e-903c-4377-97eb-0eb017939756","Type":"ContainerStarted","Data":"9c63eb5e009018e8d849a0d8b1dbbe02c1cb4190d93a0ca787daf3176f63857c"} Sep 29 20:11:00 crc kubenswrapper[4780]: I0929 20:11:00.816952 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.81672385 podStartE2EDuration="3.81672385s" podCreationTimestamp="2025-09-29 20:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:00.803521372 +0000 UTC m=+5260.751819446" watchObservedRunningTime="2025-09-29 20:11:00.81672385 +0000 UTC m=+5260.765021924" Sep 29 20:11:01 crc kubenswrapper[4780]: I0929 20:11:01.754096 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:11:01 crc kubenswrapper[4780]: E0929 20:11:01.755212 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:11:02 crc 
kubenswrapper[4780]: I0929 20:11:02.447190 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.448101 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.457080 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.512759 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4g6x\" (UniqueName: \"kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x\") pod \"mariadb-client\" (UID: \"9ac1d4aa-7734-4b6a-9783-eec818d7fa70\") " pod="openstack/mariadb-client" Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.614655 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4g6x\" (UniqueName: \"kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x\") pod \"mariadb-client\" (UID: \"9ac1d4aa-7734-4b6a-9783-eec818d7fa70\") " pod="openstack/mariadb-client" Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.640812 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4g6x\" (UniqueName: \"kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x\") pod \"mariadb-client\" (UID: \"9ac1d4aa-7734-4b6a-9783-eec818d7fa70\") " pod="openstack/mariadb-client" Sep 29 20:11:02 crc kubenswrapper[4780]: I0929 20:11:02.776216 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:03 crc kubenswrapper[4780]: I0929 20:11:03.290433 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:03 crc kubenswrapper[4780]: W0929 20:11:03.297416 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ac1d4aa_7734_4b6a_9783_eec818d7fa70.slice/crio-ec752d9ec7ac3905a273a3e6b5aa923cb38bdf40b4c45f1ef52565b532a95f56 WatchSource:0}: Error finding container ec752d9ec7ac3905a273a3e6b5aa923cb38bdf40b4c45f1ef52565b532a95f56: Status 404 returned error can't find the container with id ec752d9ec7ac3905a273a3e6b5aa923cb38bdf40b4c45f1ef52565b532a95f56 Sep 29 20:11:03 crc kubenswrapper[4780]: I0929 20:11:03.815020 4780 generic.go:334] "Generic (PLEG): container finished" podID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" containerID="16275540018b8a60a11155bd2fa9fd66463ad312af6c0ee038dc92e0317425ec" exitCode=0 Sep 29 20:11:03 crc kubenswrapper[4780]: I0929 20:11:03.815147 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9ac1d4aa-7734-4b6a-9783-eec818d7fa70","Type":"ContainerDied","Data":"16275540018b8a60a11155bd2fa9fd66463ad312af6c0ee038dc92e0317425ec"} Sep 29 20:11:03 crc kubenswrapper[4780]: I0929 20:11:03.815444 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9ac1d4aa-7734-4b6a-9783-eec818d7fa70","Type":"ContainerStarted","Data":"ec752d9ec7ac3905a273a3e6b5aa923cb38bdf40b4c45f1ef52565b532a95f56"} Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.301919 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.330883 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_9ac1d4aa-7734-4b6a-9783-eec818d7fa70/mariadb-client/0.log" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.363057 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.366522 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4g6x\" (UniqueName: \"kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x\") pod \"9ac1d4aa-7734-4b6a-9783-eec818d7fa70\" (UID: \"9ac1d4aa-7734-4b6a-9783-eec818d7fa70\") " Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.368508 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.375001 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x" (OuterVolumeSpecName: "kube-api-access-m4g6x") pod "9ac1d4aa-7734-4b6a-9783-eec818d7fa70" (UID: "9ac1d4aa-7734-4b6a-9783-eec818d7fa70"). InnerVolumeSpecName "kube-api-access-m4g6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.471733 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4g6x\" (UniqueName: \"kubernetes.io/projected/9ac1d4aa-7734-4b6a-9783-eec818d7fa70-kube-api-access-m4g6x\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.490147 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:05 crc kubenswrapper[4780]: E0929 20:11:05.490648 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" containerName="mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.490676 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" containerName="mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.490970 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" containerName="mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.491838 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.502125 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.573382 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966zt\" (UniqueName: \"kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt\") pod \"mariadb-client\" (UID: \"2c5b1baf-3980-4b2e-b846-f7c2a085118b\") " pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.675356 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966zt\" (UniqueName: \"kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt\") pod \"mariadb-client\" (UID: \"2c5b1baf-3980-4b2e-b846-f7c2a085118b\") " pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.704738 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966zt\" (UniqueName: \"kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt\") pod \"mariadb-client\" (UID: \"2c5b1baf-3980-4b2e-b846-f7c2a085118b\") " pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.824125 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.842807 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec752d9ec7ac3905a273a3e6b5aa923cb38bdf40b4c45f1ef52565b532a95f56" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.842896 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:05 crc kubenswrapper[4780]: I0929 20:11:05.923032 4780 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" podUID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" Sep 29 20:11:06 crc kubenswrapper[4780]: I0929 20:11:06.201330 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:06 crc kubenswrapper[4780]: W0929 20:11:06.206568 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c5b1baf_3980_4b2e_b846_f7c2a085118b.slice/crio-f3c7f0a4672a2998f8fe06e95354c5a30ab31fa1a1b00a17d370a7c135b83e0a WatchSource:0}: Error finding container f3c7f0a4672a2998f8fe06e95354c5a30ab31fa1a1b00a17d370a7c135b83e0a: Status 404 returned error can't find the container with id f3c7f0a4672a2998f8fe06e95354c5a30ab31fa1a1b00a17d370a7c135b83e0a Sep 29 20:11:06 crc kubenswrapper[4780]: I0929 20:11:06.768949 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac1d4aa-7734-4b6a-9783-eec818d7fa70" path="/var/lib/kubelet/pods/9ac1d4aa-7734-4b6a-9783-eec818d7fa70/volumes" Sep 29 20:11:06 crc kubenswrapper[4780]: I0929 20:11:06.853419 4780 generic.go:334] "Generic (PLEG): container finished" podID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" containerID="4d36f21a8fb1b5d9f7d41f261f3e101c1e6fe1285895768325e234e75ee31daf" exitCode=0 Sep 29 20:11:06 crc kubenswrapper[4780]: I0929 20:11:06.853499 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2c5b1baf-3980-4b2e-b846-f7c2a085118b","Type":"ContainerDied","Data":"4d36f21a8fb1b5d9f7d41f261f3e101c1e6fe1285895768325e234e75ee31daf"} Sep 29 20:11:06 crc kubenswrapper[4780]: I0929 20:11:06.853568 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2c5b1baf-3980-4b2e-b846-f7c2a085118b","Type":"ContainerStarted","Data":"f3c7f0a4672a2998f8fe06e95354c5a30ab31fa1a1b00a17d370a7c135b83e0a"} Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.266181 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.320078 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_2c5b1baf-3980-4b2e-b846-f7c2a085118b/mariadb-client/0.log" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.324773 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-966zt\" (UniqueName: \"kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt\") pod \"2c5b1baf-3980-4b2e-b846-f7c2a085118b\" (UID: \"2c5b1baf-3980-4b2e-b846-f7c2a085118b\") " Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.334897 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt" (OuterVolumeSpecName: "kube-api-access-966zt") pod "2c5b1baf-3980-4b2e-b846-f7c2a085118b" (UID: "2c5b1baf-3980-4b2e-b846-f7c2a085118b"). InnerVolumeSpecName "kube-api-access-966zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.350694 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.360076 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.426985 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-966zt\" (UniqueName: \"kubernetes.io/projected/2c5b1baf-3980-4b2e-b846-f7c2a085118b-kube-api-access-966zt\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.766220 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" path="/var/lib/kubelet/pods/2c5b1baf-3980-4b2e-b846-f7c2a085118b/volumes" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.885504 4780 scope.go:117] "RemoveContainer" containerID="4d36f21a8fb1b5d9f7d41f261f3e101c1e6fe1285895768325e234e75ee31daf" Sep 29 20:11:08 crc kubenswrapper[4780]: I0929 20:11:08.885877 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Sep 29 20:11:12 crc kubenswrapper[4780]: I0929 20:11:12.754019 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:11:12 crc kubenswrapper[4780]: E0929 20:11:12.755095 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:11:25 crc kubenswrapper[4780]: I0929 20:11:25.754661 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:11:25 crc kubenswrapper[4780]: E0929 20:11:25.755976 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:11:27 crc kubenswrapper[4780]: I0929 20:11:27.600795 4780 scope.go:117] "RemoveContainer" containerID="92757498b50f98edc1a9df4bd51360d9991efe26e6409aad81aa0557e0bc67cb" Sep 29 20:11:36 crc kubenswrapper[4780]: I0929 20:11:36.753875 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:11:37 crc kubenswrapper[4780]: I0929 20:11:37.179521 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5"} Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.948748 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 20:11:40 crc kubenswrapper[4780]: E0929 20:11:40.949661 4780 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" containerName="mariadb-client" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.949675 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" containerName="mariadb-client" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.949861 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5b1baf-3980-4b2e-b846-f7c2a085118b" containerName="mariadb-client" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.950769 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.953031 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.955307 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.955335 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.955341 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.956984 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7885n" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.960044 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.988120 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.990644 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.994562 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Sep 29 20:11:40 crc kubenswrapper[4780]: I0929 20:11:40.996766 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009300 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpxvn\" (UniqueName: \"kubernetes.io/projected/bcafc040-9d77-4ea9-9515-cba560d9a6ca-kube-api-access-bpxvn\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009395 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009441 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009467 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009494 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009538 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009560 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.009598 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.021018 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.027910 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 
20:11:41.111254 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111346 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-config\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111413 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111499 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111585 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111619 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111691 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111747 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111789 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111865 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84151286-7e33-474f-9247-d1222cae1067-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111932 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.111961 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr8nr\" (UniqueName: \"kubernetes.io/projected/493a3b25-de4a-438c-b19e-210f6618c08d-kube-api-access-sr8nr\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112007 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112084 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112117 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112163 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112200 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grcj8\" (UniqueName: \"kubernetes.io/projected/84151286-7e33-474f-9247-d1222cae1067-kube-api-access-grcj8\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112271 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112306 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-config\") pod 
\"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112383 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112415 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112436 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.112545 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.113294 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpxvn\" (UniqueName: \"kubernetes.io/projected/bcafc040-9d77-4ea9-9515-cba560d9a6ca-kube-api-access-bpxvn\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.113384 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.113811 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.114013 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcafc040-9d77-4ea9-9515-cba560d9a6ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.117610 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.117675 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f267759b70a7ebed15db55af478dc49518e9df9e0d3c825ea8aa8cfa13889cc/globalmount\"" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.121114 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.122134 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.126552 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcafc040-9d77-4ea9-9515-cba560d9a6ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.133090 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpxvn\" (UniqueName: \"kubernetes.io/projected/bcafc040-9d77-4ea9-9515-cba560d9a6ca-kube-api-access-bpxvn\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.172647 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7216be4-879e-445f-acba-4ca19f0d7aa5\") pod \"ovsdbserver-nb-0\" (UID: \"bcafc040-9d77-4ea9-9515-cba560d9a6ca\") " pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.214959 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215029 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215121 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: 
\"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215158 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215185 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-config\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215212 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215243 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215282 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215312 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215363 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215406 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84151286-7e33-474f-9247-d1222cae1067-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215444 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr8nr\" (UniqueName: \"kubernetes.io/projected/493a3b25-de4a-438c-b19e-210f6618c08d-kube-api-access-sr8nr\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215505 4780 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215539 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grcj8\" (UniqueName: \"kubernetes.io/projected/84151286-7e33-474f-9247-d1222cae1067-kube-api-access-grcj8\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215573 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.215606 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-config\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.216736 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-config\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.218142 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/84151286-7e33-474f-9247-d1222cae1067-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.218193 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.218824 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84151286-7e33-474f-9247-d1222cae1067-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.218871 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-config\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.218950 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.219563 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493a3b25-de4a-438c-b19e-210f6618c08d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.223416 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.223531 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.223663 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.223982 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/279b4e32271393aee86731e8076746bce36296a523542b6a9e35e779116434be/globalmount\"" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.224596 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/493a3b25-de4a-438c-b19e-210f6618c08d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.224990 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.231759 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/84151286-7e33-474f-9247-d1222cae1067-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.231903 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.231938 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b062726ddc1c7b8aa89d85fada5cb4236858833460537834dc636658a2e529f/globalmount\"" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.238691 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr8nr\" (UniqueName: \"kubernetes.io/projected/493a3b25-de4a-438c-b19e-210f6618c08d-kube-api-access-sr8nr\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.245275 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grcj8\" (UniqueName: \"kubernetes.io/projected/84151286-7e33-474f-9247-d1222cae1067-kube-api-access-grcj8\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.257073 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19b44184-10a2-4c02-8464-1c8e69a770c7\") pod \"ovsdbserver-nb-2\" (UID: \"493a3b25-de4a-438c-b19e-210f6618c08d\") " pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.271702 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7e445f5-3c76-48a7-831a-421fba1d90a2\") pod \"ovsdbserver-nb-1\" (UID: \"84151286-7e33-474f-9247-d1222cae1067\") " pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.275236 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.318279 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.326810 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.636449 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 29 20:11:41 crc kubenswrapper[4780]: W0929 20:11:41.639996 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcafc040_9d77_4ea9_9515_cba560d9a6ca.slice/crio-e03acc0eb12cb0540188f5d7657bd8364f4ba9f0fd23ad5185006f2f18c5fa10 WatchSource:0}: Error finding container e03acc0eb12cb0540188f5d7657bd8364f4ba9f0fd23ad5185006f2f18c5fa10: Status 404 returned error can't find the container with id e03acc0eb12cb0540188f5d7657bd8364f4ba9f0fd23ad5185006f2f18c5fa10 Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.805608 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.807724 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.812682 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.812820 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.813422 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.813602 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5tqdq" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.834323 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.838500 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.846194 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.881813 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.884714 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.893349 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.902869 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.928235 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.928725 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.928909 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-config\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.929004 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.929084 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.929155 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.929226 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbcj\" (UniqueName: \"kubernetes.io/projected/ac41da62-3218-48ed-b10d-144bf1f0e85f-kube-api-access-7jbcj\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 20:11:41.929316 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:41 crc kubenswrapper[4780]: I0929 
20:11:41.936744 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.000826 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Sep 29 20:11:42 crc kubenswrapper[4780]: W0929 20:11:42.011822 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493a3b25_de4a_438c_b19e_210f6618c08d.slice/crio-44c4c2153eaf208a1225f76c977b3cdce683c6634c44cc938427682635d87701 WatchSource:0}: Error finding container 44c4c2153eaf208a1225f76c977b3cdce683c6634c44cc938427682635d87701: Status 404 returned error can't find the container with id 44c4c2153eaf208a1225f76c977b3cdce683c6634c44cc938427682635d87701 Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030445 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-config\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030488 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030514 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030530 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030558 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030577 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbcj\" (UniqueName: \"kubernetes.io/projected/ac41da62-3218-48ed-b10d-144bf1f0e85f-kube-api-access-7jbcj\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030596 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030625 
4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030653 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-653448b4-e367-4b8b-9481-d3751413dfd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-653448b4-e367-4b8b-9481-d3751413dfd1\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030667 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030684 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030702 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.030719 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.031348 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.031689 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033256 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-config\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033297 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033349 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033367 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033441 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033466 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033504 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033522 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn29j\" (UniqueName: \"kubernetes.io/projected/0e63d364-5147-4383-ae55-fde7de3b1894-kube-api-access-tn29j\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033563 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.033645 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzfw\" (UniqueName: \"kubernetes.io/projected/fc6329f7-7f65-48b4-9a99-3fe225456e58-kube-api-access-bkzfw\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.032957 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.036147 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.036185 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a17d54f42cb30d4f99a3f96027a7cf2b6650618b0b7da42845b0bdade505045/globalmount\"" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.036212 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac41da62-3218-48ed-b10d-144bf1f0e85f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.037323 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.037653 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.042983 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac41da62-3218-48ed-b10d-144bf1f0e85f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.049622 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbcj\" (UniqueName: \"kubernetes.io/projected/ac41da62-3218-48ed-b10d-144bf1f0e85f-kube-api-access-7jbcj\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.073338 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c96b9a90-abd9-45ec-bae3-1f700fdceb69\") pod \"ovsdbserver-sb-0\" (UID: \"ac41da62-3218-48ed-b10d-144bf1f0e85f\") " pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135338 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzfw\" (UniqueName: \"kubernetes.io/projected/fc6329f7-7f65-48b4-9a99-3fe225456e58-kube-api-access-bkzfw\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135444 4780 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135524 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135610 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-653448b4-e367-4b8b-9481-d3751413dfd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-653448b4-e367-4b8b-9481-d3751413dfd1\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135658 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135685 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135705 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135751 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135828 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-config\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135864 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135919 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135939 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.135993 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.136015 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.136073 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn29j\" (UniqueName: \"kubernetes.io/projected/0e63d364-5147-4383-ae55-fde7de3b1894-kube-api-access-tn29j\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.136095 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.137682 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.137793 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.138416 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.138886 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-config\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.139495 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.140518 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.140550 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/abe76539e5cb6228bec062bfb46e1ce304ed5d69f15084dc530b85905f36372a/globalmount\"" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.141107 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.141133 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-653448b4-e367-4b8b-9481-d3751413dfd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-653448b4-e367-4b8b-9481-d3751413dfd1\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a80931d19692c2c8d088528b26dbc50226283658fde6d22d80a4de25278684b2/globalmount\"" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.141372 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.142014 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.142792 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.142858 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6329f7-7f65-48b4-9a99-3fe225456e58-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.150250 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.150521 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6329f7-7f65-48b4-9a99-3fe225456e58-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.150607 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e63d364-5147-4383-ae55-fde7de3b1894-config\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.156301 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzfw\" (UniqueName: \"kubernetes.io/projected/fc6329f7-7f65-48b4-9a99-3fe225456e58-kube-api-access-bkzfw\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.157888 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e63d364-5147-4383-ae55-fde7de3b1894-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.158626 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn29j\" (UniqueName: \"kubernetes.io/projected/0e63d364-5147-4383-ae55-fde7de3b1894-kube-api-access-tn29j\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.175446 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f361ad-eecd-4e7c-b79e-81f34c915f9d\") pod \"ovsdbserver-sb-1\" (UID: \"fc6329f7-7f65-48b4-9a99-3fe225456e58\") " pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.177041 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-653448b4-e367-4b8b-9481-d3751413dfd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-653448b4-e367-4b8b-9481-d3751413dfd1\") pod \"ovsdbserver-sb-2\" (UID: \"0e63d364-5147-4383-ae55-fde7de3b1894\") " pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.179117 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.218706 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.228940 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bcafc040-9d77-4ea9-9515-cba560d9a6ca","Type":"ContainerStarted","Data":"0f12b88143f2539b413ebb6137c40906f301f7b89e574ddb26104a62497005a9"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.228984 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bcafc040-9d77-4ea9-9515-cba560d9a6ca","Type":"ContainerStarted","Data":"e571fdcf07be74217b1fa68dbd91146114177bb5cdccdf84ab9cf9d2b397867d"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.228995 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bcafc040-9d77-4ea9-9515-cba560d9a6ca","Type":"ContainerStarted","Data":"e03acc0eb12cb0540188f5d7657bd8364f4ba9f0fd23ad5185006f2f18c5fa10"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.229789 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"493a3b25-de4a-438c-b19e-210f6618c08d","Type":"ContainerStarted","Data":"44c4c2153eaf208a1225f76c977b3cdce683c6634c44cc938427682635d87701"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.232590 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"84151286-7e33-474f-9247-d1222cae1067","Type":"ContainerStarted","Data":"8ed98af2ef7deb1e06c4bd22f8c4f722c47f881515f34556bc2a850918e89409"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.232654 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"84151286-7e33-474f-9247-d1222cae1067","Type":"ContainerStarted","Data":"dbb8b5399e0c445914c93fa7d097272c561db44f8f00ebbdeb5b96f22da548d7"} Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.254190 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.2541736119999998 podStartE2EDuration="3.254173612s" podCreationTimestamp="2025-09-29 20:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:42.251699011 +0000 UTC m=+5302.199997045" watchObservedRunningTime="2025-09-29 20:11:42.254173612 +0000 UTC m=+5302.202471656" Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.581646 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Sep 29 20:11:42 crc kubenswrapper[4780]: W0929 20:11:42.584943 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6329f7_7f65_48b4_9a99_3fe225456e58.slice/crio-2a09d1d1d85b10a48a1695c54a1d9135ede3e1ea3306130eebfa89855538a563 WatchSource:0}: Error finding container 2a09d1d1d85b10a48a1695c54a1d9135ede3e1ea3306130eebfa89855538a563: Status 404 returned error can't find the container with id 2a09d1d1d85b10a48a1695c54a1d9135ede3e1ea3306130eebfa89855538a563 Sep 29 20:11:42 crc kubenswrapper[4780]: I0929 20:11:42.677605 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Sep 29 20:11:42 crc kubenswrapper[4780]: W0929 20:11:42.693876 4780 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e63d364_5147_4383_ae55_fde7de3b1894.slice/crio-0b2857ddd6d06a62f6a830867ee4a33366977b6b1cfcd7fd01fc616cd01584ae WatchSource:0}: Error finding container 0b2857ddd6d06a62f6a830867ee4a33366977b6b1cfcd7fd01fc616cd01584ae: Status 404 returned error can't find the container with id 0b2857ddd6d06a62f6a830867ee4a33366977b6b1cfcd7fd01fc616cd01584ae Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.243105 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e63d364-5147-4383-ae55-fde7de3b1894","Type":"ContainerStarted","Data":"bca747bc552e52ce4e28ec96dd48c2b89148952e055b74385f1ce921165da530"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.243457 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e63d364-5147-4383-ae55-fde7de3b1894","Type":"ContainerStarted","Data":"3ace24b278efccecc31cadde5f491a8f7540a5af5eb3210440eff4fb87984ed4"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.243493 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"0e63d364-5147-4383-ae55-fde7de3b1894","Type":"ContainerStarted","Data":"0b2857ddd6d06a62f6a830867ee4a33366977b6b1cfcd7fd01fc616cd01584ae"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.246732 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"84151286-7e33-474f-9247-d1222cae1067","Type":"ContainerStarted","Data":"6a1806f908814a14326a5a28eaaab312c7a4e52eecfc07bf294a8325050208a1"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.249396 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"493a3b25-de4a-438c-b19e-210f6618c08d","Type":"ContainerStarted","Data":"012257dd6fa3e912a418625fe09bace80b079c3a6a3d15ec7ec94c635a7e160d"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.249445 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"493a3b25-de4a-438c-b19e-210f6618c08d","Type":"ContainerStarted","Data":"5d4775be79c22b62d1bb645721e8df15c7bbde9be552abaa956b7dd9bee54fc7"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.251396 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fc6329f7-7f65-48b4-9a99-3fe225456e58","Type":"ContainerStarted","Data":"f9fbc5bcddd1ee145c30fb46b5fa79d8a97891356136abf834598580c19c3eb8"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.251534 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fc6329f7-7f65-48b4-9a99-3fe225456e58","Type":"ContainerStarted","Data":"099f89808df15332c7ee2e3d4ff7271937d3435b0a10fb5f4117417536006a6d"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.251627 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fc6329f7-7f65-48b4-9a99-3fe225456e58","Type":"ContainerStarted","Data":"2a09d1d1d85b10a48a1695c54a1d9135ede3e1ea3306130eebfa89855538a563"} Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.269609 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.269578221 podStartE2EDuration="3.269578221s" podCreationTimestamp="2025-09-29 20:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-29 20:11:43.261015545 +0000 UTC m=+5303.209313609" watchObservedRunningTime="2025-09-29 20:11:43.269578221 +0000 UTC m=+5303.217876285" Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.283774 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.2837448160000005 podStartE2EDuration="4.283744816s" podCreationTimestamp="2025-09-29 20:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:43.282808529 +0000 UTC m=+5303.231106583" watchObservedRunningTime="2025-09-29 20:11:43.283744816 +0000 UTC m=+5303.232042870" Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.303653 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.303630856 podStartE2EDuration="4.303630856s" podCreationTimestamp="2025-09-29 20:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:43.301189996 +0000 UTC m=+5303.249488040" watchObservedRunningTime="2025-09-29 20:11:43.303630856 +0000 UTC m=+5303.251928900" Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.329272 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.329252149 podStartE2EDuration="3.329252149s" podCreationTimestamp="2025-09-29 20:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:43.324578916 +0000 UTC m=+5303.272877000" watchObservedRunningTime="2025-09-29 20:11:43.329252149 +0000 UTC m=+5303.277550193" Sep 29 20:11:43 crc kubenswrapper[4780]: I0929 20:11:43.516953 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 29 20:11:43 crc kubenswrapper[4780]: W0929 20:11:43.531230 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac41da62_3218_48ed_b10d_144bf1f0e85f.slice/crio-f1fc3c4bfae3694a28e728d1e0b06c830e87d843233b4ef6371b97b50636aca8 WatchSource:0}: Error finding container f1fc3c4bfae3694a28e728d1e0b06c830e87d843233b4ef6371b97b50636aca8: Status 404 returned error can't find the container with id f1fc3c4bfae3694a28e728d1e0b06c830e87d843233b4ef6371b97b50636aca8 Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.268130 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ac41da62-3218-48ed-b10d-144bf1f0e85f","Type":"ContainerStarted","Data":"4ce722389d98771f82ad10064165b976de04d6c470fce378bda1e243279d6161"} Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.268613 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ac41da62-3218-48ed-b10d-144bf1f0e85f","Type":"ContainerStarted","Data":"e8cacc5fe3e2c746bb7ce13f6ac203ae8eccc40298248aff3edfa1f8c9b53d71"} Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.268674 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ac41da62-3218-48ed-b10d-144bf1f0e85f","Type":"ContainerStarted","Data":"f1fc3c4bfae3694a28e728d1e0b06c830e87d843233b4ef6371b97b50636aca8"} Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.275829 4780 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.307224 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.307197837 podStartE2EDuration="4.307197837s" podCreationTimestamp="2025-09-29 20:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:44.298464646 +0000 UTC m=+5304.246762750" watchObservedRunningTime="2025-09-29 20:11:44.307197837 +0000 UTC m=+5304.255495921" Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.319190 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.327367 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:44 crc kubenswrapper[4780]: I0929 20:11:44.339105 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.150769 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.179746 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.220704 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.248725 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.280113 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.281676 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:45 crc kubenswrapper[4780]: I0929 20:11:45.290584 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.297237 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.320819 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.327027 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.371583 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.685764 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.687258 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.699307 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.700002 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.742178 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.742232 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.742258 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7kh\" (UniqueName: \"kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.742456 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.844108 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.844159 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.844185 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7kh\" (UniqueName: \"kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.844241 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" 
Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.844994 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.845706 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.845738 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:46 crc kubenswrapper[4780]: I0929 20:11:46.864585 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7kh\" (UniqueName: \"kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh\") pod \"dnsmasq-dns-789b77bc97-7q5sn\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.016309 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.151575 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.255844 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.279593 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.388673 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.495932 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.537906 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.567365 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.580975 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.657411 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.694294 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.700256 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.704275 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.711505 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.792839 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsf4c\" (UniqueName: \"kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.793261 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.793285 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.793330 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.793356 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.894807 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.894854 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.894895 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " 
pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.894926 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.895012 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsf4c\" (UniqueName: \"kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.895807 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.897998 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.898299 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.898641 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:47 crc kubenswrapper[4780]: I0929 20:11:47.913886 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsf4c\" (UniqueName: \"kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c\") pod \"dnsmasq-dns-7986d5d7c7-29fh6\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.035678 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.220377 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.319426 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.325785 4780 generic.go:334] "Generic (PLEG): container finished" podID="8cdf2222-fb53-488c-91d2-71060756dce8" containerID="7a8ef653a5a4045c27645a3009eeafbba7bddbe9b8ea305649d8ecf020aa4eeb" exitCode=0 Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.325917 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" event={"ID":"8cdf2222-fb53-488c-91d2-71060756dce8","Type":"ContainerDied","Data":"7a8ef653a5a4045c27645a3009eeafbba7bddbe9b8ea305649d8ecf020aa4eeb"} Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.325979 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" event={"ID":"8cdf2222-fb53-488c-91d2-71060756dce8","Type":"ContainerStarted","Data":"dc91849fd39047fc1da85cb8eca797001cfbfe8c46f7deeb8303c302379d59e6"} Sep 29 20:11:48 crc kubenswrapper[4780]: W0929 20:11:48.326712 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d67219_f8e3_4372_b3ca_dd003fe11375.slice/crio-49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d WatchSource:0}: Error finding container 49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d: Status 404 returned error can't find the container with id 49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.381135 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.564844 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.604440 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config\") pod \"8cdf2222-fb53-488c-91d2-71060756dce8\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.604774 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc\") pod \"8cdf2222-fb53-488c-91d2-71060756dce8\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.604815 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb\") pod \"8cdf2222-fb53-488c-91d2-71060756dce8\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.605001 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr7kh\" (UniqueName: \"kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh\") pod \"8cdf2222-fb53-488c-91d2-71060756dce8\" (UID: \"8cdf2222-fb53-488c-91d2-71060756dce8\") " Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.609202 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh" (OuterVolumeSpecName: "kube-api-access-wr7kh") pod "8cdf2222-fb53-488c-91d2-71060756dce8" (UID: "8cdf2222-fb53-488c-91d2-71060756dce8"). InnerVolumeSpecName "kube-api-access-wr7kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.610135 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr7kh\" (UniqueName: \"kubernetes.io/projected/8cdf2222-fb53-488c-91d2-71060756dce8-kube-api-access-wr7kh\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.626137 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8cdf2222-fb53-488c-91d2-71060756dce8" (UID: "8cdf2222-fb53-488c-91d2-71060756dce8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.631725 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config" (OuterVolumeSpecName: "config") pod "8cdf2222-fb53-488c-91d2-71060756dce8" (UID: "8cdf2222-fb53-488c-91d2-71060756dce8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.633587 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8cdf2222-fb53-488c-91d2-71060756dce8" (UID: "8cdf2222-fb53-488c-91d2-71060756dce8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.711718 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.711751 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:48 crc kubenswrapper[4780]: I0929 20:11:48.711761 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdf2222-fb53-488c-91d2-71060756dce8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.337226 4780 generic.go:334] "Generic (PLEG): container finished" podID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerID="b877db94dcd31c3529579743d199f56e31bd63324ba265c6a7aecb462d66c9ad" exitCode=0 Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.337329 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" event={"ID":"a0d67219-f8e3-4372-b3ca-dd003fe11375","Type":"ContainerDied","Data":"b877db94dcd31c3529579743d199f56e31bd63324ba265c6a7aecb462d66c9ad"} Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.337406 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" event={"ID":"a0d67219-f8e3-4372-b3ca-dd003fe11375","Type":"ContainerStarted","Data":"49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d"} Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.339137 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" event={"ID":"8cdf2222-fb53-488c-91d2-71060756dce8","Type":"ContainerDied","Data":"dc91849fd39047fc1da85cb8eca797001cfbfe8c46f7deeb8303c302379d59e6"} Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.339197 4780 scope.go:117] "RemoveContainer" containerID="7a8ef653a5a4045c27645a3009eeafbba7bddbe9b8ea305649d8ecf020aa4eeb" Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.339218 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789b77bc97-7q5sn" Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.446919 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:49 crc kubenswrapper[4780]: I0929 20:11:49.465462 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-789b77bc97-7q5sn"] Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.346420 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" event={"ID":"a0d67219-f8e3-4372-b3ca-dd003fe11375","Type":"ContainerStarted","Data":"e9ab707808f5be1285d3889954d76ee347f1b0c9307204680ca56cecb158310b"} Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.347802 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.363222 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" podStartSLOduration=3.36320146 podStartE2EDuration="3.36320146s" podCreationTimestamp="2025-09-29 20:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:11:50.360007389 +0000 UTC m=+5310.308305433" watchObservedRunningTime="2025-09-29 20:11:50.36320146 +0000 UTC m=+5310.311499504" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.763946 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdf2222-fb53-488c-91d2-71060756dce8" path="/var/lib/kubelet/pods/8cdf2222-fb53-488c-91d2-71060756dce8/volumes" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.963649 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Sep 29 20:11:50 crc kubenswrapper[4780]: E0929 20:11:50.964195 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdf2222-fb53-488c-91d2-71060756dce8" containerName="init" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.964225 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdf2222-fb53-488c-91d2-71060756dce8" containerName="init" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.964485 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdf2222-fb53-488c-91d2-71060756dce8" containerName="init" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.966028 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.976710 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Sep 29 20:11:50 crc kubenswrapper[4780]: I0929 20:11:50.981512 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.051103 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6ddadfee-0076-4181-aeff-aace0c0ffb1f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.051187 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.051309 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgzp\" (UniqueName: \"kubernetes.io/projected/6ddadfee-0076-4181-aeff-aace0c0ffb1f-kube-api-access-rzgzp\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.152893 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6ddadfee-0076-4181-aeff-aace0c0ffb1f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.152939 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.152974 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgzp\" (UniqueName: \"kubernetes.io/projected/6ddadfee-0076-4181-aeff-aace0c0ffb1f-kube-api-access-rzgzp\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.156173 4780 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.156225 4780 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2946d834fb0d21f8e1c8555df15c9aa7faaef308b7845eb739b13d3c0e18416/globalmount\"" pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.160973 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6ddadfee-0076-4181-aeff-aace0c0ffb1f-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.172135 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgzp\" (UniqueName: \"kubernetes.io/projected/6ddadfee-0076-4181-aeff-aace0c0ffb1f-kube-api-access-rzgzp\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.199728 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-052930e5-62ab-4c78-a47c-45bc2c98628e\") pod \"ovn-copy-data\" (UID: \"6ddadfee-0076-4181-aeff-aace0c0ffb1f\") " pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.300338 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.639935 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Sep 29 20:11:51 crc kubenswrapper[4780]: W0929 20:11:51.648812 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ddadfee_0076_4181_aeff_aace0c0ffb1f.slice/crio-8649d571c91d748902131425fe9dca8fd6dbf218fa6ff60bda6de028843b25d3 WatchSource:0}: Error finding container 8649d571c91d748902131425fe9dca8fd6dbf218fa6ff60bda6de028843b25d3: Status 404 returned error can't find the container with id 8649d571c91d748902131425fe9dca8fd6dbf218fa6ff60bda6de028843b25d3 Sep 29 20:11:51 crc kubenswrapper[4780]: I0929 20:11:51.651949 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 29 20:11:52 crc kubenswrapper[4780]: I0929 20:11:52.377663 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6ddadfee-0076-4181-aeff-aace0c0ffb1f","Type":"ContainerStarted","Data":"8649d571c91d748902131425fe9dca8fd6dbf218fa6ff60bda6de028843b25d3"} Sep 29 20:11:53 crc kubenswrapper[4780]: I0929 20:11:53.388266 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6ddadfee-0076-4181-aeff-aace0c0ffb1f","Type":"ContainerStarted","Data":"8e42725822309061f8d2d63e87a5d656c6c480dbedd91ceddfa48f524d8a875d"} Sep 29 20:11:53 crc kubenswrapper[4780]: I0929 20:11:53.411499 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.791311867 podStartE2EDuration="4.411480668s" podCreationTimestamp="2025-09-29 20:11:49 +0000 UTC" firstStartedPulling="2025-09-29 20:11:51.651532036 +0000 UTC m=+5311.599830090" lastFinishedPulling="2025-09-29 20:11:52.271700847 +0000 UTC m=+5312.219998891" observedRunningTime="2025-09-29 20:11:53.408935245 +0000 UTC m=+5313.357233309" watchObservedRunningTime="2025-09-29 20:11:53.411480668 +0000 UTC m=+5313.359778702" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.037632 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.114412 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.114637 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="dnsmasq-dns" containerID="cri-o://a63c54af105e15fbbff097c399930b624f738987943240c6f830e15b3a8baa36" gracePeriod=10 Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.436202 4780 generic.go:334] "Generic (PLEG): container finished" podID="c925558e-f918-4b3c-be41-38f20faeefea" containerID="a63c54af105e15fbbff097c399930b624f738987943240c6f830e15b3a8baa36" exitCode=0 Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.436279 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" event={"ID":"c925558e-f918-4b3c-be41-38f20faeefea","Type":"ContainerDied","Data":"a63c54af105e15fbbff097c399930b624f738987943240c6f830e15b3a8baa36"} Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.570312 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 29 20:11:58 
crc kubenswrapper[4780]: I0929 20:11:58.571545 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.575643 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.575921 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.576083 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.576227 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kg6ld" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.600757 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.619417 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.696234 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.696565 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-scripts\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.696669 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tq9\" (UniqueName: \"kubernetes.io/projected/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-kube-api-access-k4tq9\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.696914 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-config\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.697129 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.697236 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.697266 4780 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.798599 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config\") pod \"c925558e-f918-4b3c-be41-38f20faeefea\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.798683 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc\") pod \"c925558e-f918-4b3c-be41-38f20faeefea\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.798703 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjdvf\" (UniqueName: \"kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf\") pod \"c925558e-f918-4b3c-be41-38f20faeefea\" (UID: \"c925558e-f918-4b3c-be41-38f20faeefea\") " Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.798940 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.798967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799000 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-scripts\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799017 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tq9\" (UniqueName: \"kubernetes.io/projected/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-kube-api-access-k4tq9\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799100 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-config\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799153 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799193 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.799906 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.800424 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-config\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.800802 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-scripts\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.805631 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.809327 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf" (OuterVolumeSpecName: "kube-api-access-mjdvf") pod "c925558e-f918-4b3c-be41-38f20faeefea" (UID: "c925558e-f918-4b3c-be41-38f20faeefea"). InnerVolumeSpecName "kube-api-access-mjdvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.809925 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.816329 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tq9\" (UniqueName: \"kubernetes.io/projected/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-kube-api-access-k4tq9\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.821569 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb0ef5f-8130-43d6-b9b5-fe5c480f842b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b\") " pod="openstack/ovn-northd-0" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.857093 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config" (OuterVolumeSpecName: "config") pod "c925558e-f918-4b3c-be41-38f20faeefea" (UID: "c925558e-f918-4b3c-be41-38f20faeefea"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.861602 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c925558e-f918-4b3c-be41-38f20faeefea" (UID: "c925558e-f918-4b3c-be41-38f20faeefea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.900845 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.900879 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c925558e-f918-4b3c-be41-38f20faeefea-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.900896 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjdvf\" (UniqueName: \"kubernetes.io/projected/c925558e-f918-4b3c-be41-38f20faeefea-kube-api-access-mjdvf\") on node \"crc\" DevicePath \"\"" Sep 29 20:11:58 crc kubenswrapper[4780]: I0929 20:11:58.927330 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.370533 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 29 20:11:59 crc kubenswrapper[4780]: W0929 20:11:59.386185 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fb0ef5f_8130_43d6_b9b5_fe5c480f842b.slice/crio-2e2db95b512ab59064831cb1ff8c0a3c08ceefb2d66c934ef51f7c173c81df33 WatchSource:0}: Error finding container 2e2db95b512ab59064831cb1ff8c0a3c08ceefb2d66c934ef51f7c173c81df33: Status 404 returned error can't find the container with id 2e2db95b512ab59064831cb1ff8c0a3c08ceefb2d66c934ef51f7c173c81df33 Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.459876 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b","Type":"ContainerStarted","Data":"2e2db95b512ab59064831cb1ff8c0a3c08ceefb2d66c934ef51f7c173c81df33"} Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.462475 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" event={"ID":"c925558e-f918-4b3c-be41-38f20faeefea","Type":"ContainerDied","Data":"15aa768be84ef30b9a7cb9367176e0fed1d7b6540bba3eef00629d400a41f323"} Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.462565 4780 scope.go:117] "RemoveContainer" containerID="a63c54af105e15fbbff097c399930b624f738987943240c6f830e15b3a8baa36" Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.462799 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.499916 4780 scope.go:117] "RemoveContainer" containerID="8783a37f331e28ea26f34b89916d013f5d27ab71e128f5f7e9fa1b3ef37158eb" Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.544616 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:11:59 crc kubenswrapper[4780]: I0929 20:11:59.553276 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6885566dd9-t8smt"] Sep 29 20:12:00 crc kubenswrapper[4780]: I0929 20:12:00.475134 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b","Type":"ContainerStarted","Data":"b5fefd60898ed32816f4d7a50de9a2a233f6bf7bf7d24e0210daf7ee3108e724"} Sep 29 20:12:00 crc kubenswrapper[4780]: I0929 20:12:00.475511 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6fb0ef5f-8130-43d6-b9b5-fe5c480f842b","Type":"ContainerStarted","Data":"a981cadf4f7099717479e99b1f506c0e863c9d686ccd644213a3bf4a51fafdf2"} Sep 29 20:12:00 crc kubenswrapper[4780]: I0929 20:12:00.475539 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 29 20:12:00 crc kubenswrapper[4780]: I0929 20:12:00.506443 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.506413325 podStartE2EDuration="2.506413325s" podCreationTimestamp="2025-09-29 20:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:00.503017057 +0000 UTC m=+5320.451315121" watchObservedRunningTime="2025-09-29 20:12:00.506413325 +0000 UTC m=+5320.454711379" Sep 29 20:12:00 crc kubenswrapper[4780]: I0929 20:12:00.773732 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c925558e-f918-4b3c-be41-38f20faeefea" path="/var/lib/kubelet/pods/c925558e-f918-4b3c-be41-38f20faeefea/volumes" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.464504 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6885566dd9-t8smt" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.246:5353: i/o timeout" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.573184 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qcgmn"] Sep 29 20:12:03 crc kubenswrapper[4780]: E0929 20:12:03.573825 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="dnsmasq-dns" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.573931 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="dnsmasq-dns" Sep 29 20:12:03 crc kubenswrapper[4780]: E0929 20:12:03.574028 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="init" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.574132 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="c925558e-f918-4b3c-be41-38f20faeefea" containerName="init" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.574399 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="c925558e-f918-4b3c-be41-38f20faeefea" 
containerName="dnsmasq-dns" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.575180 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.585215 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qcgmn"] Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.684532 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdzpx\" (UniqueName: \"kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx\") pod \"keystone-db-create-qcgmn\" (UID: \"5c0791ee-7613-491e-940b-b4810ca0be6f\") " pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.787302 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdzpx\" (UniqueName: \"kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx\") pod \"keystone-db-create-qcgmn\" (UID: \"5c0791ee-7613-491e-940b-b4810ca0be6f\") " pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.815630 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdzpx\" (UniqueName: \"kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx\") pod \"keystone-db-create-qcgmn\" (UID: \"5c0791ee-7613-491e-940b-b4810ca0be6f\") " pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:03 crc kubenswrapper[4780]: I0929 20:12:03.908993 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:04 crc kubenswrapper[4780]: I0929 20:12:04.377384 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qcgmn"] Sep 29 20:12:04 crc kubenswrapper[4780]: W0929 20:12:04.386557 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c0791ee_7613_491e_940b_b4810ca0be6f.slice/crio-15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee WatchSource:0}: Error finding container 15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee: Status 404 returned error can't find the container with id 15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee Sep 29 20:12:04 crc kubenswrapper[4780]: I0929 20:12:04.517016 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qcgmn" event={"ID":"5c0791ee-7613-491e-940b-b4810ca0be6f","Type":"ContainerStarted","Data":"15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee"} Sep 29 20:12:05 crc kubenswrapper[4780]: I0929 20:12:05.529806 4780 generic.go:334] "Generic (PLEG): container finished" podID="5c0791ee-7613-491e-940b-b4810ca0be6f" containerID="a752e921fd55c6c7c14c327624205ceef7d53986b2bec2563f9972ac2a019452" exitCode=0 Sep 29 20:12:05 crc kubenswrapper[4780]: I0929 20:12:05.529892 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qcgmn" event={"ID":"5c0791ee-7613-491e-940b-b4810ca0be6f","Type":"ContainerDied","Data":"a752e921fd55c6c7c14c327624205ceef7d53986b2bec2563f9972ac2a019452"} Sep 29 20:12:06 crc kubenswrapper[4780]: I0929 20:12:06.978248 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.143197 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdzpx\" (UniqueName: \"kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx\") pod \"5c0791ee-7613-491e-940b-b4810ca0be6f\" (UID: \"5c0791ee-7613-491e-940b-b4810ca0be6f\") " Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.152930 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx" (OuterVolumeSpecName: "kube-api-access-jdzpx") pod "5c0791ee-7613-491e-940b-b4810ca0be6f" (UID: "5c0791ee-7613-491e-940b-b4810ca0be6f"). InnerVolumeSpecName "kube-api-access-jdzpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.248746 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdzpx\" (UniqueName: \"kubernetes.io/projected/5c0791ee-7613-491e-940b-b4810ca0be6f-kube-api-access-jdzpx\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.552201 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qcgmn" event={"ID":"5c0791ee-7613-491e-940b-b4810ca0be6f","Type":"ContainerDied","Data":"15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee"} Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.552251 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15982a8858bf4825f9bc4579152cea4e7d58a239903734e6d3c64f165378f9ee" Sep 29 20:12:07 crc kubenswrapper[4780]: I0929 20:12:07.552280 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qcgmn" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.699993 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c775-account-create-qq5ml"] Sep 29 20:12:13 crc kubenswrapper[4780]: E0929 20:12:13.701226 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0791ee-7613-491e-940b-b4810ca0be6f" containerName="mariadb-database-create" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.701248 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0791ee-7613-491e-940b-b4810ca0be6f" containerName="mariadb-database-create" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.701543 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0791ee-7613-491e-940b-b4810ca0be6f" containerName="mariadb-database-create" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.703150 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.706197 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.715029 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c775-account-create-qq5ml"] Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.882107 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r9s\" (UniqueName: \"kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s\") pod \"keystone-c775-account-create-qq5ml\" (UID: \"b8cb3bd4-36b7-4431-892d-3972efabd631\") " pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:13 crc kubenswrapper[4780]: I0929 20:12:13.984331 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r9s\" (UniqueName: \"kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s\") pod \"keystone-c775-account-create-qq5ml\" (UID: \"b8cb3bd4-36b7-4431-892d-3972efabd631\") " pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:14 crc kubenswrapper[4780]: I0929 20:12:14.002770 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 29 20:12:14 crc kubenswrapper[4780]: I0929 20:12:14.006716 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r9s\" (UniqueName: \"kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s\") pod \"keystone-c775-account-create-qq5ml\" (UID: \"b8cb3bd4-36b7-4431-892d-3972efabd631\") " pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:14 crc kubenswrapper[4780]: I0929 20:12:14.038725 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:14 crc kubenswrapper[4780]: I0929 20:12:14.502008 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c775-account-create-qq5ml"] Sep 29 20:12:14 crc kubenswrapper[4780]: I0929 20:12:14.630614 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c775-account-create-qq5ml" event={"ID":"b8cb3bd4-36b7-4431-892d-3972efabd631","Type":"ContainerStarted","Data":"5a7d699c4be3ad9c4c9c6334a039a5ff7f08a22604ae4a68c9f7a4f103dd5f29"} Sep 29 20:12:15 crc kubenswrapper[4780]: I0929 20:12:15.644367 4780 generic.go:334] "Generic (PLEG): container finished" podID="b8cb3bd4-36b7-4431-892d-3972efabd631" containerID="7bc66ddbe84d4fd38f275b188b9203df952208ff9cdbeb990a3cb91f61087ba1" exitCode=0 Sep 29 20:12:15 crc kubenswrapper[4780]: I0929 20:12:15.644494 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c775-account-create-qq5ml" event={"ID":"b8cb3bd4-36b7-4431-892d-3972efabd631","Type":"ContainerDied","Data":"7bc66ddbe84d4fd38f275b188b9203df952208ff9cdbeb990a3cb91f61087ba1"} Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.089753 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.256219 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4r9s\" (UniqueName: \"kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s\") pod \"b8cb3bd4-36b7-4431-892d-3972efabd631\" (UID: \"b8cb3bd4-36b7-4431-892d-3972efabd631\") " Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.262423 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s" (OuterVolumeSpecName: "kube-api-access-s4r9s") pod "b8cb3bd4-36b7-4431-892d-3972efabd631" (UID: "b8cb3bd4-36b7-4431-892d-3972efabd631"). InnerVolumeSpecName "kube-api-access-s4r9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.359825 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4r9s\" (UniqueName: \"kubernetes.io/projected/b8cb3bd4-36b7-4431-892d-3972efabd631-kube-api-access-s4r9s\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.671825 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c775-account-create-qq5ml" event={"ID":"b8cb3bd4-36b7-4431-892d-3972efabd631","Type":"ContainerDied","Data":"5a7d699c4be3ad9c4c9c6334a039a5ff7f08a22604ae4a68c9f7a4f103dd5f29"} Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.672237 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7d699c4be3ad9c4c9c6334a039a5ff7f08a22604ae4a68c9f7a4f103dd5f29" Sep 29 20:12:17 crc kubenswrapper[4780]: I0929 20:12:17.671907 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c775-account-create-qq5ml" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.201417 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xfsht"] Sep 29 20:12:19 crc kubenswrapper[4780]: E0929 20:12:19.201769 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8cb3bd4-36b7-4431-892d-3972efabd631" containerName="mariadb-account-create" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.201785 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cb3bd4-36b7-4431-892d-3972efabd631" containerName="mariadb-account-create" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.202005 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8cb3bd4-36b7-4431-892d-3972efabd631" containerName="mariadb-account-create" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.202623 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.206394 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.207436 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.207738 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mxsds" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.208102 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.223905 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xfsht"] Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.400186 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.400275 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzjlw\" (UniqueName: \"kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.400361 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.502341 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.502568 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.502620 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjlw\" (UniqueName: \"kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.508476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " 
pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.511736 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.522493 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjlw\" (UniqueName: \"kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw\") pod \"keystone-db-sync-xfsht\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:19 crc kubenswrapper[4780]: I0929 20:12:19.821212 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:20 crc kubenswrapper[4780]: I0929 20:12:20.373210 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xfsht"] Sep 29 20:12:20 crc kubenswrapper[4780]: W0929 20:12:20.381826 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79032b7c_f044_406a_953d_2378bbb62e8b.slice/crio-f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb WatchSource:0}: Error finding container f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb: Status 404 returned error can't find the container with id f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb Sep 29 20:12:20 crc kubenswrapper[4780]: I0929 20:12:20.707687 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xfsht" event={"ID":"79032b7c-f044-406a-953d-2378bbb62e8b","Type":"ContainerStarted","Data":"6d28cb1bc0f44467fa1b9472b8075aa5eb0666a72a4033d05d97e8a7fc24f760"} Sep 29 20:12:20 crc kubenswrapper[4780]: I0929 20:12:20.707742 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xfsht" event={"ID":"79032b7c-f044-406a-953d-2378bbb62e8b","Type":"ContainerStarted","Data":"f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb"} Sep 29 20:12:20 crc kubenswrapper[4780]: I0929 20:12:20.736119 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xfsht" podStartSLOduration=1.736100953 podStartE2EDuration="1.736100953s" podCreationTimestamp="2025-09-29 20:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:20.730318857 +0000 UTC m=+5340.678616941" watchObservedRunningTime="2025-09-29 20:12:20.736100953 +0000 UTC m=+5340.684399007" Sep 29 20:12:22 crc kubenswrapper[4780]: I0929 20:12:22.733642 4780 generic.go:334] "Generic (PLEG): container finished" podID="79032b7c-f044-406a-953d-2378bbb62e8b" containerID="6d28cb1bc0f44467fa1b9472b8075aa5eb0666a72a4033d05d97e8a7fc24f760" exitCode=0 Sep 29 20:12:22 crc kubenswrapper[4780]: I0929 20:12:22.733701 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xfsht" event={"ID":"79032b7c-f044-406a-953d-2378bbb62e8b","Type":"ContainerDied","Data":"6d28cb1bc0f44467fa1b9472b8075aa5eb0666a72a4033d05d97e8a7fc24f760"} Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.164605 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.197704 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzjlw\" (UniqueName: \"kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw\") pod \"79032b7c-f044-406a-953d-2378bbb62e8b\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.197770 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data\") pod \"79032b7c-f044-406a-953d-2378bbb62e8b\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.197804 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle\") pod \"79032b7c-f044-406a-953d-2378bbb62e8b\" (UID: \"79032b7c-f044-406a-953d-2378bbb62e8b\") " Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.205129 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw" (OuterVolumeSpecName: "kube-api-access-mzjlw") pod "79032b7c-f044-406a-953d-2378bbb62e8b" (UID: "79032b7c-f044-406a-953d-2378bbb62e8b"). InnerVolumeSpecName "kube-api-access-mzjlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.237628 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79032b7c-f044-406a-953d-2378bbb62e8b" (UID: "79032b7c-f044-406a-953d-2378bbb62e8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.246605 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data" (OuterVolumeSpecName: "config-data") pod "79032b7c-f044-406a-953d-2378bbb62e8b" (UID: "79032b7c-f044-406a-953d-2378bbb62e8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.299336 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.299385 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzjlw\" (UniqueName: \"kubernetes.io/projected/79032b7c-f044-406a-953d-2378bbb62e8b-kube-api-access-mzjlw\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.299398 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79032b7c-f044-406a-953d-2378bbb62e8b-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.772457 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xfsht" event={"ID":"79032b7c-f044-406a-953d-2378bbb62e8b","Type":"ContainerDied","Data":"f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb"} Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.772514 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53574a7a73ef8780a79835f7e20f658a8b976655f0408f094046f98a670febb" Sep 29 20:12:24 crc kubenswrapper[4780]: I0929 20:12:24.772537 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xfsht" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.063263 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ntcvv"] Sep 29 20:12:25 crc kubenswrapper[4780]: E0929 20:12:25.063771 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79032b7c-f044-406a-953d-2378bbb62e8b" containerName="keystone-db-sync" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.063786 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="79032b7c-f044-406a-953d-2378bbb62e8b" containerName="keystone-db-sync" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.063992 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="79032b7c-f044-406a-953d-2378bbb62e8b" containerName="keystone-db-sync" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.064723 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.068224 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.068501 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mxsds" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.068503 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.068663 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.075733 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bcf577bd5-j6nqg"] Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.077503 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.085226 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bcf577bd5-j6nqg"] Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.108621 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ntcvv"] Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124076 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124132 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-dns-svc\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124179 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkgnp\" (UniqueName: \"kubernetes.io/projected/34a6eac5-da6b-40fa-a11f-42301b421306-kube-api-access-dkgnp\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124231 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-nb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124324 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-sb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124348 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124375 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124393 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: 
\"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124420 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124445 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjkh8\" (UniqueName: \"kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.124479 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-config\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.226838 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228274 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228397 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228511 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjkh8\" (UniqueName: \"kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228641 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-config\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228798 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc 
kubenswrapper[4780]: I0929 20:12:25.228899 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-dns-svc\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.228991 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkgnp\" (UniqueName: \"kubernetes.io/projected/34a6eac5-da6b-40fa-a11f-42301b421306-kube-api-access-dkgnp\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.229109 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-nb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.229232 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-sb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.229323 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.229581 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-config\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.229691 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-dns-svc\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.230304 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-nb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.230378 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34a6eac5-da6b-40fa-a11f-42301b421306-ovsdbserver-sb\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.235715 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.235924 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.236062 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.236615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.242881 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.245451 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjkh8\" (UniqueName: \"kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8\") pod \"keystone-bootstrap-ntcvv\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.249065 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkgnp\" (UniqueName: \"kubernetes.io/projected/34a6eac5-da6b-40fa-a11f-42301b421306-kube-api-access-dkgnp\") pod \"dnsmasq-dns-bcf577bd5-j6nqg\" (UID: \"34a6eac5-da6b-40fa-a11f-42301b421306\") " pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.389963 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.406561 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.694337 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bcf577bd5-j6nqg"] Sep 29 20:12:25 crc kubenswrapper[4780]: W0929 20:12:25.705138 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a6eac5_da6b_40fa_a11f_42301b421306.slice/crio-a5d72dc5237325f9c63e9a6e82e3448d80246c181b270748e46c36448a548f56 WatchSource:0}: Error finding container a5d72dc5237325f9c63e9a6e82e3448d80246c181b270748e46c36448a548f56: Status 404 returned error can't find the container with id a5d72dc5237325f9c63e9a6e82e3448d80246c181b270748e46c36448a548f56 Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.787361 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" event={"ID":"34a6eac5-da6b-40fa-a11f-42301b421306","Type":"ContainerStarted","Data":"a5d72dc5237325f9c63e9a6e82e3448d80246c181b270748e46c36448a548f56"} Sep 29 20:12:25 crc kubenswrapper[4780]: I0929 20:12:25.844632 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ntcvv"] Sep 29 20:12:25 crc kubenswrapper[4780]: W0929 20:12:25.854068 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd07fdb46_4e94_42d7_a846_9cbf23dfc29f.slice/crio-07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff WatchSource:0}: Error finding container 07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff: Status 404 returned error can't find the container with id 07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff Sep 29 20:12:26 crc kubenswrapper[4780]: I0929 20:12:26.800471 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ntcvv" event={"ID":"d07fdb46-4e94-42d7-a846-9cbf23dfc29f","Type":"ContainerStarted","Data":"ee6742837406809c4437c243810da7ad1b7b51048e989b6ee08b5648ffdfff5f"} Sep 29 20:12:26 crc kubenswrapper[4780]: I0929 20:12:26.800860 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ntcvv" event={"ID":"d07fdb46-4e94-42d7-a846-9cbf23dfc29f","Type":"ContainerStarted","Data":"07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff"} Sep 29 20:12:26 crc kubenswrapper[4780]: I0929 20:12:26.812823 4780 generic.go:334] "Generic (PLEG): container finished" podID="34a6eac5-da6b-40fa-a11f-42301b421306" containerID="36101218ce95fdeb44d08243b56c878df05c052f7513e9f1c10fb0adbb86192d" exitCode=0 Sep 29 20:12:26 crc kubenswrapper[4780]: I0929 20:12:26.812880 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" event={"ID":"34a6eac5-da6b-40fa-a11f-42301b421306","Type":"ContainerDied","Data":"36101218ce95fdeb44d08243b56c878df05c052f7513e9f1c10fb0adbb86192d"} Sep 29 20:12:26 crc kubenswrapper[4780]: I0929 20:12:26.846114 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ntcvv" podStartSLOduration=1.8460921030000002 podStartE2EDuration="1.846092103s" podCreationTimestamp="2025-09-29 20:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:26.831663929 +0000 UTC m=+5346.779961983" watchObservedRunningTime="2025-09-29 20:12:26.846092103 +0000 UTC m=+5346.794390147" Sep 
29 20:12:27 crc kubenswrapper[4780]: I0929 20:12:27.667555 4780 scope.go:117] "RemoveContainer" containerID="6856f336587debb5533c741a442609522bdbc048e7eb5bd28b4078f0298d8be2" Sep 29 20:12:27 crc kubenswrapper[4780]: I0929 20:12:27.823242 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" event={"ID":"34a6eac5-da6b-40fa-a11f-42301b421306","Type":"ContainerStarted","Data":"e0bd04ec610224249d2d431559c9298b81eaf783a25687261f190385a9ccc463"} Sep 29 20:12:27 crc kubenswrapper[4780]: I0929 20:12:27.823383 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:27 crc kubenswrapper[4780]: I0929 20:12:27.860933 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" podStartSLOduration=2.860904655 podStartE2EDuration="2.860904655s" podCreationTimestamp="2025-09-29 20:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:27.858540768 +0000 UTC m=+5347.806838922" watchObservedRunningTime="2025-09-29 20:12:27.860904655 +0000 UTC m=+5347.809202739" Sep 29 20:12:29 crc kubenswrapper[4780]: I0929 20:12:29.846400 4780 generic.go:334] "Generic (PLEG): container finished" podID="d07fdb46-4e94-42d7-a846-9cbf23dfc29f" containerID="ee6742837406809c4437c243810da7ad1b7b51048e989b6ee08b5648ffdfff5f" exitCode=0 Sep 29 20:12:29 crc kubenswrapper[4780]: I0929 20:12:29.846517 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ntcvv" event={"ID":"d07fdb46-4e94-42d7-a846-9cbf23dfc29f","Type":"ContainerDied","Data":"ee6742837406809c4437c243810da7ad1b7b51048e989b6ee08b5648ffdfff5f"} Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.306090 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363015 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363411 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363435 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363460 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363539 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjkh8\" (UniqueName: \"kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.363564 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data\") pod \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\" (UID: \"d07fdb46-4e94-42d7-a846-9cbf23dfc29f\") " Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.369522 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.370649 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts" (OuterVolumeSpecName: "scripts") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.378260 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8" (OuterVolumeSpecName: "kube-api-access-kjkh8") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "kube-api-access-kjkh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.379902 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.397965 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.410334 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data" (OuterVolumeSpecName: "config-data") pod "d07fdb46-4e94-42d7-a846-9cbf23dfc29f" (UID: "d07fdb46-4e94-42d7-a846-9cbf23dfc29f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.465922 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjkh8\" (UniqueName: \"kubernetes.io/projected/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-kube-api-access-kjkh8\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.465971 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.465990 4780 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.466010 4780 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.466027 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.466066 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07fdb46-4e94-42d7-a846-9cbf23dfc29f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.874192 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ntcvv" event={"ID":"d07fdb46-4e94-42d7-a846-9cbf23dfc29f","Type":"ContainerDied","Data":"07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff"} Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.874272 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f7e3078210cd80e52104b2b33c5998a0923275fd3cf6ef7ac69a8a0b9b6dff" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.874314 4780 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ntcvv" Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.979082 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ntcvv"] Sep 29 20:12:31 crc kubenswrapper[4780]: I0929 20:12:31.986862 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ntcvv"] Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.066288 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rm2d7"] Sep 29 20:12:32 crc kubenswrapper[4780]: E0929 20:12:32.066814 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07fdb46-4e94-42d7-a846-9cbf23dfc29f" containerName="keystone-bootstrap" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.066841 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07fdb46-4e94-42d7-a846-9cbf23dfc29f" containerName="keystone-bootstrap" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.067192 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="d07fdb46-4e94-42d7-a846-9cbf23dfc29f" containerName="keystone-bootstrap" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.068098 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.071735 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.071744 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mxsds" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.077035 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.077164 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080018 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080139 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbjgd\" (UniqueName: \"kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080230 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080309 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: 
\"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080381 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.080542 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.123910 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rm2d7"] Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182244 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182313 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbjgd\" (UniqueName: \"kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182366 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182417 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182447 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.182547 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.189002 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts\") pod \"keystone-bootstrap-rm2d7\" 
(UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.189319 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.189543 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.190448 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.190665 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.205729 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbjgd\" (UniqueName: \"kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd\") pod \"keystone-bootstrap-rm2d7\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.404189 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.771481 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07fdb46-4e94-42d7-a846-9cbf23dfc29f" path="/var/lib/kubelet/pods/d07fdb46-4e94-42d7-a846-9cbf23dfc29f/volumes" Sep 29 20:12:32 crc kubenswrapper[4780]: I0929 20:12:32.957387 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rm2d7"] Sep 29 20:12:33 crc kubenswrapper[4780]: I0929 20:12:33.897628 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rm2d7" event={"ID":"ac0af9de-1686-4d95-ba41-9be33a2eef83","Type":"ContainerStarted","Data":"3940e5c7fd4dfddd5ad8b601712b58dd670b0fe44dbb63398fdd18118b2c19ac"} Sep 29 20:12:33 crc kubenswrapper[4780]: I0929 20:12:33.898161 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rm2d7" event={"ID":"ac0af9de-1686-4d95-ba41-9be33a2eef83","Type":"ContainerStarted","Data":"14e44e06c958417e3c621d71c537908f405c88b2558eea8326aec68ca42bde14"} Sep 29 20:12:33 crc kubenswrapper[4780]: I0929 20:12:33.928648 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rm2d7" podStartSLOduration=1.928598034 podStartE2EDuration="1.928598034s" podCreationTimestamp="2025-09-29 20:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:33.918033981 +0000 UTC m=+5353.866332045" watchObservedRunningTime="2025-09-29 20:12:33.928598034 +0000 UTC m=+5353.876896088" Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.408336 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bcf577bd5-j6nqg" Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.484701 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.485110 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="dnsmasq-dns" containerID="cri-o://e9ab707808f5be1285d3889954d76ee347f1b0c9307204680ca56cecb158310b" gracePeriod=10 Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.946844 4780 generic.go:334] "Generic (PLEG): container finished" podID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerID="e9ab707808f5be1285d3889954d76ee347f1b0c9307204680ca56cecb158310b" exitCode=0 Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.947067 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" event={"ID":"a0d67219-f8e3-4372-b3ca-dd003fe11375","Type":"ContainerDied","Data":"e9ab707808f5be1285d3889954d76ee347f1b0c9307204680ca56cecb158310b"} Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.947233 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" event={"ID":"a0d67219-f8e3-4372-b3ca-dd003fe11375","Type":"ContainerDied","Data":"49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d"} Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.947249 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49346dec9a4dfa7a4b8e02dd3c4f27ddb5019ad4360a12d41a49522b2641f54d" Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.954883 4780 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.955512 4780 generic.go:334] "Generic (PLEG): container finished" podID="ac0af9de-1686-4d95-ba41-9be33a2eef83" containerID="3940e5c7fd4dfddd5ad8b601712b58dd670b0fe44dbb63398fdd18118b2c19ac" exitCode=0 Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.955551 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rm2d7" event={"ID":"ac0af9de-1686-4d95-ba41-9be33a2eef83","Type":"ContainerDied","Data":"3940e5c7fd4dfddd5ad8b601712b58dd670b0fe44dbb63398fdd18118b2c19ac"} Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.961515 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb\") pod \"a0d67219-f8e3-4372-b3ca-dd003fe11375\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.961601 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb\") pod \"a0d67219-f8e3-4372-b3ca-dd003fe11375\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.961655 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config\") pod \"a0d67219-f8e3-4372-b3ca-dd003fe11375\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.961688 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc\") pod \"a0d67219-f8e3-4372-b3ca-dd003fe11375\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.961717 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsf4c\" (UniqueName: \"kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c\") pod \"a0d67219-f8e3-4372-b3ca-dd003fe11375\" (UID: \"a0d67219-f8e3-4372-b3ca-dd003fe11375\") " Sep 29 20:12:35 crc kubenswrapper[4780]: I0929 20:12:35.968376 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c" (OuterVolumeSpecName: "kube-api-access-dsf4c") pod "a0d67219-f8e3-4372-b3ca-dd003fe11375" (UID: "a0d67219-f8e3-4372-b3ca-dd003fe11375"). InnerVolumeSpecName "kube-api-access-dsf4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.015357 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config" (OuterVolumeSpecName: "config") pod "a0d67219-f8e3-4372-b3ca-dd003fe11375" (UID: "a0d67219-f8e3-4372-b3ca-dd003fe11375"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.018886 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0d67219-f8e3-4372-b3ca-dd003fe11375" (UID: "a0d67219-f8e3-4372-b3ca-dd003fe11375"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.031775 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0d67219-f8e3-4372-b3ca-dd003fe11375" (UID: "a0d67219-f8e3-4372-b3ca-dd003fe11375"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.045648 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0d67219-f8e3-4372-b3ca-dd003fe11375" (UID: "a0d67219-f8e3-4372-b3ca-dd003fe11375"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.063663 4780 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-config\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.063695 4780 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.063705 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsf4c\" (UniqueName: \"kubernetes.io/projected/a0d67219-f8e3-4372-b3ca-dd003fe11375-kube-api-access-dsf4c\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.063718 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.063727 4780 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0d67219-f8e3-4372-b3ca-dd003fe11375-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:36 crc kubenswrapper[4780]: I0929 20:12:36.966114 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7986d5d7c7-29fh6" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.011397 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.022506 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7986d5d7c7-29fh6"] Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.378108 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.493762 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.493903 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbjgd\" (UniqueName: \"kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.494016 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.494169 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.494222 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.494282 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data\") pod \"ac0af9de-1686-4d95-ba41-9be33a2eef83\" (UID: \"ac0af9de-1686-4d95-ba41-9be33a2eef83\") " Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.500424 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.501521 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts" (OuterVolumeSpecName: "scripts") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.501991 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd" (OuterVolumeSpecName: "kube-api-access-qbjgd") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "kube-api-access-qbjgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.506619 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.525023 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data" (OuterVolumeSpecName: "config-data") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.530744 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac0af9de-1686-4d95-ba41-9be33a2eef83" (UID: "ac0af9de-1686-4d95-ba41-9be33a2eef83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597504 4780 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597536 4780 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597550 4780 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-config-data\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597596 4780 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-scripts\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597669 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbjgd\" (UniqueName: \"kubernetes.io/projected/ac0af9de-1686-4d95-ba41-9be33a2eef83-kube-api-access-qbjgd\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.597687 4780 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0af9de-1686-4d95-ba41-9be33a2eef83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.992474 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rm2d7" event={"ID":"ac0af9de-1686-4d95-ba41-9be33a2eef83","Type":"ContainerDied","Data":"14e44e06c958417e3c621d71c537908f405c88b2558eea8326aec68ca42bde14"} Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.992538 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e44e06c958417e3c621d71c537908f405c88b2558eea8326aec68ca42bde14" Sep 29 20:12:37 crc kubenswrapper[4780]: I0929 20:12:37.992605 4780 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rm2d7" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132168 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-78bf57bfd6-ppq7g"] Sep 29 20:12:38 crc kubenswrapper[4780]: E0929 20:12:38.132636 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="init" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132657 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="init" Sep 29 20:12:38 crc kubenswrapper[4780]: E0929 20:12:38.132676 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0af9de-1686-4d95-ba41-9be33a2eef83" containerName="keystone-bootstrap" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132688 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0af9de-1686-4d95-ba41-9be33a2eef83" containerName="keystone-bootstrap" Sep 29 20:12:38 crc kubenswrapper[4780]: E0929 20:12:38.132731 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="dnsmasq-dns" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132739 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="dnsmasq-dns" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132921 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0af9de-1686-4d95-ba41-9be33a2eef83" containerName="keystone-bootstrap" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.132938 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" containerName="dnsmasq-dns" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.133692 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142087 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142113 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142821 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142849 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78bf57bfd6-ppq7g"] Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142872 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.142888 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.143489 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mxsds" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.313095 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-internal-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.313522 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-public-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.313735 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-combined-ca-bundle\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.313908 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6smw\" (UniqueName: \"kubernetes.io/projected/29bf2166-fccc-4c96-b7bb-1ee954856bf5-kube-api-access-l6smw\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.314104 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-config-data\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.314344 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-scripts\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: 
\"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.314542 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-credential-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.314710 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-fernet-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416182 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-public-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416479 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-combined-ca-bundle\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416519 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6smw\" (UniqueName: \"kubernetes.io/projected/29bf2166-fccc-4c96-b7bb-1ee954856bf5-kube-api-access-l6smw\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416555 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-config-data\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416587 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-scripts\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416615 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-credential-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416636 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-fernet-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 
20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.416652 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-internal-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.421969 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-fernet-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.424610 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-internal-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.424842 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-config-data\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.427439 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-scripts\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.428584 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-public-tls-certs\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.431545 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-combined-ca-bundle\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.439178 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29bf2166-fccc-4c96-b7bb-1ee954856bf5-credential-keys\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.447264 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6smw\" (UniqueName: \"kubernetes.io/projected/29bf2166-fccc-4c96-b7bb-1ee954856bf5-kube-api-access-l6smw\") pod \"keystone-78bf57bfd6-ppq7g\" (UID: \"29bf2166-fccc-4c96-b7bb-1ee954856bf5\") " pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.458783 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.767670 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d67219-f8e3-4372-b3ca-dd003fe11375" path="/var/lib/kubelet/pods/a0d67219-f8e3-4372-b3ca-dd003fe11375/volumes" Sep 29 20:12:38 crc kubenswrapper[4780]: I0929 20:12:38.956600 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78bf57bfd6-ppq7g"] Sep 29 20:12:39 crc kubenswrapper[4780]: I0929 20:12:39.011159 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78bf57bfd6-ppq7g" event={"ID":"29bf2166-fccc-4c96-b7bb-1ee954856bf5","Type":"ContainerStarted","Data":"c68a29e32f718a69b5aa0bf3c784b90227f66dc0d8cfe21f73ce2dd6acc1b1b5"} Sep 29 20:12:40 crc kubenswrapper[4780]: I0929 20:12:40.024973 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78bf57bfd6-ppq7g" event={"ID":"29bf2166-fccc-4c96-b7bb-1ee954856bf5","Type":"ContainerStarted","Data":"accfdc59d0fc2f7fbcdab715f3d71246764dfbcbff2dcdcbd5677e4035302f15"} Sep 29 20:12:40 crc kubenswrapper[4780]: I0929 20:12:40.025580 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:12:40 crc kubenswrapper[4780]: I0929 20:12:40.064783 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-78bf57bfd6-ppq7g" podStartSLOduration=2.064743833 podStartE2EDuration="2.064743833s" podCreationTimestamp="2025-09-29 20:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:12:40.061670235 +0000 UTC m=+5360.009968319" watchObservedRunningTime="2025-09-29 20:12:40.064743833 +0000 UTC m=+5360.013041917" Sep 29 20:13:09 crc kubenswrapper[4780]: I0929 20:13:09.955093 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-78bf57bfd6-ppq7g" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.751382 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.754973 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.758610 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-znkt5" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.759141 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.760929 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.772519 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.791816 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.795031 4780 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4e33510-0ebb-4101-8122-f1c1e24586f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T20:13:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T20:13:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T20:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-29T20:13:13Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:80b8547cf5821a4eb5461d1ac14edbc700ef03926268af960bf511647de027af\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2z97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-29T20:13:13Z\\\"}}\" for pod 
\"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Sep 29 20:13:13 crc kubenswrapper[4780]: E0929 20:13:13.803520 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-q2z97 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-q2z97 openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.804840 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.864266 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.866947 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.871131 4780 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" podUID="7a7cd602-3896-4652-9764-b33305d9669d" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.876417 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.989298 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqmz\" (UniqueName: \"kubernetes.io/projected/7a7cd602-3896-4652-9764-b33305d9669d-kube-api-access-sgqmz\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.989514 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.989702 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:13 crc kubenswrapper[4780]: I0929 20:13:13.989777 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.091235 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqmz\" (UniqueName: \"kubernetes.io/projected/7a7cd602-3896-4652-9764-b33305d9669d-kube-api-access-sgqmz\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.091340 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.091412 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.091453 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.092444 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.099119 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.099422 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7cd602-3896-4652-9764-b33305d9669d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.113826 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqmz\" (UniqueName: \"kubernetes.io/projected/7a7cd602-3896-4652-9764-b33305d9669d-kube-api-access-sgqmz\") pod \"openstackclient\" (UID: \"7a7cd602-3896-4652-9764-b33305d9669d\") " pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.204276 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.439405 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.442299 4780 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" podUID="7a7cd602-3896-4652-9764-b33305d9669d" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.455092 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.458471 4780 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" podUID="7a7cd602-3896-4652-9764-b33305d9669d" Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.674533 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 29 20:13:14 crc kubenswrapper[4780]: I0929 20:13:14.770562 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" path="/var/lib/kubelet/pods/b4e33510-0ebb-4101-8122-f1c1e24586f7/volumes" Sep 29 20:13:15 crc kubenswrapper[4780]: I0929 20:13:15.450140 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 29 20:13:15 crc kubenswrapper[4780]: I0929 20:13:15.450221 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a7cd602-3896-4652-9764-b33305d9669d","Type":"ContainerStarted","Data":"f5ff1adeb5f794a98d102fa42c5b1b180abf75c657b06111b559ab76a4ba4b5e"} Sep 29 20:13:15 crc kubenswrapper[4780]: I0929 20:13:15.450652 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a7cd602-3896-4652-9764-b33305d9669d","Type":"ContainerStarted","Data":"7d72c164303cebb25fe27a7e9a74fb4069df7414f96522e453f89291f33111f4"} Sep 29 20:13:15 crc kubenswrapper[4780]: I0929 20:13:15.474338 4780 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4e33510-0ebb-4101-8122-f1c1e24586f7" podUID="7a7cd602-3896-4652-9764-b33305d9669d" Sep 29 20:13:15 crc kubenswrapper[4780]: I0929 20:13:15.479990 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.479969601 podStartE2EDuration="2.479969601s" podCreationTimestamp="2025-09-29 20:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:13:15.469995039 +0000 UTC m=+5395.418293113" watchObservedRunningTime="2025-09-29 20:13:15.479969601 +0000 UTC m=+5395.428267655" Sep 29 20:14:03 crc kubenswrapper[4780]: I0929 20:14:03.223777 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:14:03 crc kubenswrapper[4780]: I0929 20:14:03.224359 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:14:27 crc kubenswrapper[4780]: I0929 20:14:27.801033 4780 scope.go:117] "RemoveContainer" containerID="6aa52cc784f9990a21bf658a4fdb458f00aec85dcc0246dc86e67b23cce2e165" Sep 29 20:14:27 crc kubenswrapper[4780]: I0929 20:14:27.831772 4780 scope.go:117] "RemoveContainer" containerID="9310dbff183335c1f7278fa86f7cc8bf66b096f6eb2ff0028846b2903ef2d08a" Sep 29 20:14:27 crc kubenswrapper[4780]: I0929 20:14:27.894327 4780 scope.go:117] "RemoveContainer" 
containerID="55b24169fd6f112ae7bc335f57093ec95562a5188a6c8834c7ab3de95933b579" Sep 29 20:14:27 crc kubenswrapper[4780]: I0929 20:14:27.940988 4780 scope.go:117] "RemoveContainer" containerID="db8e9ca3dcc446f1d34ff93b944814248b5d82fb10c7588cc1a295c01bb674a7" Sep 29 20:14:28 crc kubenswrapper[4780]: I0929 20:14:28.000313 4780 scope.go:117] "RemoveContainer" containerID="781f7b6b7335269720839f04ca793aaef6cdc37db4aae5835f8da5977964224e" Sep 29 20:14:28 crc kubenswrapper[4780]: I0929 20:14:28.046989 4780 scope.go:117] "RemoveContainer" containerID="7eabb7045a672a44c7c8dd1326facae09d8047f2d4932a74bb92829e3c67f6fe" Sep 29 20:14:28 crc kubenswrapper[4780]: I0929 20:14:28.082549 4780 scope.go:117] "RemoveContainer" containerID="eca1d1effa6b407393527a27880255d5f98319d0367241d8bdecd7924be6bc13" Sep 29 20:14:33 crc kubenswrapper[4780]: I0929 20:14:33.223274 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:14:33 crc kubenswrapper[4780]: I0929 20:14:33.225672 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:14:42 crc kubenswrapper[4780]: E0929 20:14:42.559924 4780 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:40064->38.102.83.80:37067: write tcp 38.102.83.80:40064->38.102.83.80:37067: write: broken pipe Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.568196 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.578552 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.587477 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.694792 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7jdg\" (UniqueName: \"kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.694949 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.694991 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.796811 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7jdg\" (UniqueName: \"kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.796920 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.796949 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.797360 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.797714 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.821380 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7jdg\" (UniqueName: \"kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg\") pod \"redhat-operators-2z4m6\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:47 crc kubenswrapper[4780]: I0929 20:14:47.934091 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:48 crc kubenswrapper[4780]: W0929 20:14:48.392806 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd7b8b8_dc53_45d4_9386_79a703f57174.slice/crio-b1efb508247452a6d8b9654b54ffc6f9794606d8c00c9e8803cdd7b9997695c9 WatchSource:0}: Error finding container b1efb508247452a6d8b9654b54ffc6f9794606d8c00c9e8803cdd7b9997695c9: Status 404 returned error can't find the container with id b1efb508247452a6d8b9654b54ffc6f9794606d8c00c9e8803cdd7b9997695c9 Sep 29 20:14:48 crc kubenswrapper[4780]: I0929 20:14:48.395205 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:14:48 crc kubenswrapper[4780]: I0929 20:14:48.525039 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerStarted","Data":"b1efb508247452a6d8b9654b54ffc6f9794606d8c00c9e8803cdd7b9997695c9"} Sep 29 20:14:49 crc kubenswrapper[4780]: I0929 20:14:49.538653 4780 generic.go:334] "Generic (PLEG): container finished" podID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerID="d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113" exitCode=0 Sep 29 20:14:49 crc kubenswrapper[4780]: I0929 20:14:49.538766 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerDied","Data":"d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113"} Sep 29 20:14:51 crc kubenswrapper[4780]: I0929 20:14:51.564135 4780 generic.go:334] "Generic (PLEG): container finished" podID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerID="0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1" exitCode=0 Sep 29 20:14:51 crc kubenswrapper[4780]: I0929 20:14:51.564205 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerDied","Data":"0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1"} Sep 29 20:14:52 crc kubenswrapper[4780]: I0929 20:14:52.577775 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerStarted","Data":"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34"} Sep 29 20:14:52 crc kubenswrapper[4780]: I0929 20:14:52.615905 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2z4m6" podStartSLOduration=3.056243393 podStartE2EDuration="5.615867559s" podCreationTimestamp="2025-09-29 20:14:47 +0000 UTC" firstStartedPulling="2025-09-29 20:14:49.541865309 +0000 UTC m=+5489.490163383" lastFinishedPulling="2025-09-29 20:14:52.101489475 +0000 UTC m=+5492.049787549" observedRunningTime="2025-09-29 20:14:52.601415661 +0000 UTC m=+5492.549713725" watchObservedRunningTime="2025-09-29 20:14:52.615867559 
+0000 UTC m=+5492.564165653" Sep 29 20:14:57 crc kubenswrapper[4780]: I0929 20:14:57.934196 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:57 crc kubenswrapper[4780]: I0929 20:14:57.934965 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:14:59 crc kubenswrapper[4780]: I0929 20:14:59.023866 4780 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2z4m6" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="registry-server" probeResult="failure" output=< Sep 29 20:14:59 crc kubenswrapper[4780]: timeout: failed to connect service ":50051" within 1s Sep 29 20:14:59 crc kubenswrapper[4780]: > Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.173353 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc"] Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.175217 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.178939 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.179209 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.186235 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc"] Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.234930 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.235232 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxfts\" (UniqueName: \"kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.235333 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.337611 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.337772 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxfts\" (UniqueName: \"kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.337826 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.339116 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.352364 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.359245 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxfts\" (UniqueName: \"kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts\") pod \"collect-profiles-29319615-6b4cc\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:00 crc kubenswrapper[4780]: I0929 20:15:00.507357 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:01 crc kubenswrapper[4780]: I0929 20:15:01.102585 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc"] Sep 29 20:15:01 crc kubenswrapper[4780]: I0929 20:15:01.702427 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" event={"ID":"f8a462ed-4f44-45b2-b103-a29377206105","Type":"ContainerStarted","Data":"7ab6b022de779ccba19b1e834e72514ae6e852e97f331efb65935c93e6a63383"} Sep 29 20:15:01 crc kubenswrapper[4780]: I0929 20:15:01.702810 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" event={"ID":"f8a462ed-4f44-45b2-b103-a29377206105","Type":"ContainerStarted","Data":"7941218b542478d359912596e96326e80e43890ac0e0d875dc50fc674aa8f748"} Sep 29 20:15:01 crc kubenswrapper[4780]: I0929 20:15:01.727584 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" podStartSLOduration=1.727564642 podStartE2EDuration="1.727564642s" podCreationTimestamp="2025-09-29 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:15:01.721536502 +0000 UTC m=+5501.669834576" watchObservedRunningTime="2025-09-29 20:15:01.727564642 +0000 UTC m=+5501.675862696" Sep 29 20:15:02 crc kubenswrapper[4780]: I0929 20:15:02.714174 4780 generic.go:334] "Generic (PLEG): container finished" podID="f8a462ed-4f44-45b2-b103-a29377206105" containerID="7ab6b022de779ccba19b1e834e72514ae6e852e97f331efb65935c93e6a63383" exitCode=0 Sep 29 20:15:02 crc kubenswrapper[4780]: I0929 20:15:02.714229 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" event={"ID":"f8a462ed-4f44-45b2-b103-a29377206105","Type":"ContainerDied","Data":"7ab6b022de779ccba19b1e834e72514ae6e852e97f331efb65935c93e6a63383"} Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.224046 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.224150 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.224203 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.225168 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.225273 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5" gracePeriod=600 Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.726958 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5" exitCode=0 Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.727035 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5"} Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.727395 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerStarted","Data":"930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"} Sep 29 20:15:03 crc kubenswrapper[4780]: I0929 20:15:03.727445 4780 scope.go:117] "RemoveContainer" containerID="5eb1ca00ce3c073ee939ef0e7431d747c4975a514c39fd58590a0035cbc7c4af" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.138324 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.208472 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume\") pod \"f8a462ed-4f44-45b2-b103-a29377206105\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.208585 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxfts\" (UniqueName: \"kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts\") pod \"f8a462ed-4f44-45b2-b103-a29377206105\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.208657 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume\") pod \"f8a462ed-4f44-45b2-b103-a29377206105\" (UID: \"f8a462ed-4f44-45b2-b103-a29377206105\") " Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.209740 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8a462ed-4f44-45b2-b103-a29377206105" (UID: "f8a462ed-4f44-45b2-b103-a29377206105"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.215218 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8a462ed-4f44-45b2-b103-a29377206105" (UID: "f8a462ed-4f44-45b2-b103-a29377206105"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.215483 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts" (OuterVolumeSpecName: "kube-api-access-bxfts") pod "f8a462ed-4f44-45b2-b103-a29377206105" (UID: "f8a462ed-4f44-45b2-b103-a29377206105"). InnerVolumeSpecName "kube-api-access-bxfts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.310465 4780 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8a462ed-4f44-45b2-b103-a29377206105-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.310512 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxfts\" (UniqueName: \"kubernetes.io/projected/f8a462ed-4f44-45b2-b103-a29377206105-kube-api-access-bxfts\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.310526 4780 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8a462ed-4f44-45b2-b103-a29377206105-config-volume\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.741585 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" event={"ID":"f8a462ed-4f44-45b2-b103-a29377206105","Type":"ContainerDied","Data":"7941218b542478d359912596e96326e80e43890ac0e0d875dc50fc674aa8f748"} Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.741944 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7941218b542478d359912596e96326e80e43890ac0e0d875dc50fc674aa8f748" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.741686 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29319615-6b4cc" Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.838565 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp"] Sep 29 20:15:04 crc kubenswrapper[4780]: I0929 20:15:04.849184 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29319570-d9clp"] Sep 29 20:15:06 crc kubenswrapper[4780]: I0929 20:15:06.774948 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efef81a5-564e-46ca-b4ed-6bb53ffa4c23" path="/var/lib/kubelet/pods/efef81a5-564e-46ca-b4ed-6bb53ffa4c23/volumes" Sep 29 20:15:08 crc kubenswrapper[4780]: I0929 20:15:08.027467 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:15:08 crc kubenswrapper[4780]: I0929 20:15:08.122530 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:15:08 crc kubenswrapper[4780]: I0929 20:15:08.276720 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:15:09 crc kubenswrapper[4780]: I0929 20:15:09.797491 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2z4m6" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="registry-server" containerID="cri-o://a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34" gracePeriod=2 Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.358281 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.430922 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content\") pod \"8bd7b8b8-dc53-45d4-9386-79a703f57174\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.431150 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7jdg\" (UniqueName: \"kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg\") pod \"8bd7b8b8-dc53-45d4-9386-79a703f57174\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.431231 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities\") pod \"8bd7b8b8-dc53-45d4-9386-79a703f57174\" (UID: \"8bd7b8b8-dc53-45d4-9386-79a703f57174\") " Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.432721 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities" (OuterVolumeSpecName: "utilities") pod "8bd7b8b8-dc53-45d4-9386-79a703f57174" (UID: "8bd7b8b8-dc53-45d4-9386-79a703f57174"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.439808 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg" (OuterVolumeSpecName: "kube-api-access-t7jdg") pod "8bd7b8b8-dc53-45d4-9386-79a703f57174" (UID: "8bd7b8b8-dc53-45d4-9386-79a703f57174"). InnerVolumeSpecName "kube-api-access-t7jdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.531722 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bd7b8b8-dc53-45d4-9386-79a703f57174" (UID: "8bd7b8b8-dc53-45d4-9386-79a703f57174"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.532681 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.532702 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7jdg\" (UniqueName: \"kubernetes.io/projected/8bd7b8b8-dc53-45d4-9386-79a703f57174-kube-api-access-t7jdg\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.532715 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bd7b8b8-dc53-45d4-9386-79a703f57174-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.808977 4780 generic.go:334] "Generic (PLEG): container finished" podID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerID="a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34" exitCode=0 Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.809015 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerDied","Data":"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34"} Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.809038 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z4m6" event={"ID":"8bd7b8b8-dc53-45d4-9386-79a703f57174","Type":"ContainerDied","Data":"b1efb508247452a6d8b9654b54ffc6f9794606d8c00c9e8803cdd7b9997695c9"} Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.809068 4780 scope.go:117] "RemoveContainer" containerID="a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.809119 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z4m6" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.845026 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.852589 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2z4m6"] Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.854383 4780 scope.go:117] "RemoveContainer" containerID="0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.883454 4780 scope.go:117] "RemoveContainer" containerID="d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.927980 4780 scope.go:117] "RemoveContainer" containerID="a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34" Sep 29 20:15:10 crc kubenswrapper[4780]: E0929 20:15:10.928618 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34\": container with ID starting with a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34 not found: ID does not exist" containerID="a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.928685 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34"} err="failed to get container status \"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34\": rpc error: code = NotFound desc = could not find container \"a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34\": container with ID starting with a9a4ecfeddc995582b11393a256b1c8a5db7a27e9796e87ab0a97ccd00199b34 not found: ID does not exist" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.928727 4780 scope.go:117] "RemoveContainer" containerID="0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1" Sep 29 20:15:10 crc kubenswrapper[4780]: E0929 20:15:10.929516 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1\": container with ID starting with 0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1 not found: ID does not exist" containerID="0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.929643 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1"} err="failed to get container status \"0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1\": rpc error: code = NotFound desc = could not find container \"0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1\": container with ID starting with 0d68379cdde97474f014d3cbf3c5c5454a7a7db9ee663a77ddc895870b276be1 not found: ID does not exist" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.929771 4780 scope.go:117] "RemoveContainer" containerID="d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113" Sep 29 20:15:10 crc kubenswrapper[4780]: E0929 20:15:10.930375 4780 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113\": container with ID starting with d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113 not found: ID does not exist" containerID="d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113" Sep 29 20:15:10 crc kubenswrapper[4780]: I0929 20:15:10.930440 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113"} err="failed to get container status \"d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113\": rpc error: code = NotFound desc = could not find container \"d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113\": container with ID starting with d5ad9a2b9214cf224e259661040dfdfcbec7edec8f90594acf8f457c562a3113 not found: ID does not exist" Sep 29 20:15:12 crc kubenswrapper[4780]: I0929 20:15:12.772256 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" path="/var/lib/kubelet/pods/8bd7b8b8-dc53-45d4-9386-79a703f57174/volumes" Sep 29 20:15:28 crc kubenswrapper[4780]: I0929 20:15:28.240089 4780 scope.go:117] "RemoveContainer" containerID="42d2da8bd993b5df9a17139b927edc1c108a66a8827e246fcc3495b111a7323d" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.321342 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cn2gq/must-gather-rqkr8"] Sep 29 20:15:37 crc kubenswrapper[4780]: E0929 20:15:37.322019 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="registry-server" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322031 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="registry-server" Sep 29 20:15:37 crc kubenswrapper[4780]: E0929 20:15:37.322075 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a462ed-4f44-45b2-b103-a29377206105" containerName="collect-profiles" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322081 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a462ed-4f44-45b2-b103-a29377206105" containerName="collect-profiles" Sep 29 20:15:37 crc kubenswrapper[4780]: E0929 20:15:37.322092 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="extract-utilities" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322099 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="extract-utilities" Sep 29 20:15:37 crc kubenswrapper[4780]: E0929 20:15:37.322108 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="extract-content" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322114 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="extract-content" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322259 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a462ed-4f44-45b2-b103-a29377206105" containerName="collect-profiles" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.322276 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd7b8b8-dc53-45d4-9386-79a703f57174" containerName="registry-server" Sep 29 20:15:37 crc 
Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.323048 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.325267 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cn2gq"/"default-dockercfg-cfqcw" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.325611 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cn2gq"/"kube-root-ca.crt" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.326525 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cn2gq"/"openshift-service-ca.crt" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.334763 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cn2gq/must-gather-rqkr8"] Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.455503 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqd6p\" (UniqueName: \"kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.455594 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.556985 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqd6p\" (UniqueName: \"kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.557084 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.557712 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.574679 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqd6p\" (UniqueName: \"kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p\") pod \"must-gather-rqkr8\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:37 crc kubenswrapper[4780]: I0929 20:15:37.641021 4780 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:15:38 crc kubenswrapper[4780]: I0929 20:15:38.135020 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cn2gq/must-gather-rqkr8"] Sep 29 20:15:39 crc kubenswrapper[4780]: I0929 20:15:39.111376 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" event={"ID":"76413d88-7478-4108-8f6f-c63fa89eb825","Type":"ContainerStarted","Data":"af26b8b42c33c5a04a8bb939ebd03b6c8a20d0ff86a4591f25abcab8e681682d"} Sep 29 20:15:43 crc kubenswrapper[4780]: I0929 20:15:43.179079 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" event={"ID":"76413d88-7478-4108-8f6f-c63fa89eb825","Type":"ContainerStarted","Data":"57208cde0ea680b50e0b86894dae78af7dd9f3318d8b5cc3a73d427dd1824f58"} Sep 29 20:15:43 crc kubenswrapper[4780]: I0929 20:15:43.179762 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" event={"ID":"76413d88-7478-4108-8f6f-c63fa89eb825","Type":"ContainerStarted","Data":"eb961d32f4b3107a02d0f461f85dbb79eef539acb7d190bdf55bb5bb7215c85a"} Sep 29 20:15:43 crc kubenswrapper[4780]: I0929 20:15:43.201456 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" podStartSLOduration=2.329895023 podStartE2EDuration="6.201430759s" podCreationTimestamp="2025-09-29 20:15:37 +0000 UTC" firstStartedPulling="2025-09-29 20:15:38.151026233 +0000 UTC m=+5538.099324287" lastFinishedPulling="2025-09-29 20:15:42.022561979 +0000 UTC m=+5541.970860023" observedRunningTime="2025-09-29 20:15:43.200296797 +0000 UTC m=+5543.148594871" watchObservedRunningTime="2025-09-29 20:15:43.201430759 +0000 UTC m=+5543.149728843" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.226886 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-99rb4"] Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.228398 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.261643 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nqb\" (UniqueName: \"kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.261718 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.363733 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57nqb\" (UniqueName: \"kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.363796 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.363927 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.386488 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nqb\" (UniqueName: \"kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb\") pod \"crc-debug-99rb4\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:45 crc kubenswrapper[4780]: I0929 20:15:45.544389 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:15:46 crc kubenswrapper[4780]: I0929 20:15:46.207349 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" event={"ID":"e75b087b-b8e3-4926-9d30-c66d6e54eab9","Type":"ContainerStarted","Data":"842039b8b1f559f6b86d65f419b4532a46dfb6a75f4d78949964bc85bfc4433f"} Sep 29 20:15:57 crc kubenswrapper[4780]: I0929 20:15:57.308152 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" event={"ID":"e75b087b-b8e3-4926-9d30-c66d6e54eab9","Type":"ContainerStarted","Data":"97d4ecb0cfdecfbeda64e76a3f5a2165131d1f4745e931d7688c086c26256aaf"} Sep 29 20:15:57 crc kubenswrapper[4780]: I0929 20:15:57.320595 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" podStartSLOduration=1.459064697 podStartE2EDuration="12.320580453s" podCreationTimestamp="2025-09-29 20:15:45 +0000 UTC" firstStartedPulling="2025-09-29 20:15:45.570870041 +0000 UTC m=+5545.519168085" lastFinishedPulling="2025-09-29 20:15:56.432385797 +0000 UTC m=+5556.380683841" observedRunningTime="2025-09-29 20:15:57.318841324 +0000 UTC m=+5557.267139368" watchObservedRunningTime="2025-09-29 20:15:57.320580453 +0000 UTC m=+5557.268878497" Sep 29 20:16:29 crc kubenswrapper[4780]: I0929 20:16:29.627636 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bcf577bd5-j6nqg_34a6eac5-da6b-40fa-a11f-42301b421306/init/0.log" Sep 29 20:16:29 crc kubenswrapper[4780]: I0929 20:16:29.708161 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bcf577bd5-j6nqg_34a6eac5-da6b-40fa-a11f-42301b421306/init/0.log" Sep 29 20:16:29 crc kubenswrapper[4780]: I0929 20:16:29.731984 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bcf577bd5-j6nqg_34a6eac5-da6b-40fa-a11f-42301b421306/dnsmasq-dns/0.log" Sep 29 20:16:29 crc kubenswrapper[4780]: I0929 20:16:29.885433 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78bf57bfd6-ppq7g_29bf2166-fccc-4c96-b7bb-1ee954856bf5/keystone-api/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.042682 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-rm2d7_ac0af9de-1686-4d95-ba41-9be33a2eef83/keystone-bootstrap/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.176293 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c775-account-create-qq5ml_b8cb3bd4-36b7-4431-892d-3972efabd631/mariadb-account-create/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.326209 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-qcgmn_5c0791ee-7613-491e-940b-b4810ca0be6f/mariadb-database-create/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.471337 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-xfsht_79032b7c-f044-406a-953d-2378bbb62e8b/keystone-db-sync/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.596482 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_d2556c4e-903c-4377-97eb-0eb017939756/adoption/0.log" Sep 29 20:16:30 crc kubenswrapper[4780]: I0929 20:16:30.894064 4780 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_98e8d940-9717-43b2-9919-f12c4218b3f4/mysql-bootstrap/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.130415 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98e8d940-9717-43b2-9919-f12c4218b3f4/mysql-bootstrap/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.132275 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98e8d940-9717-43b2-9919-f12c4218b3f4/galera/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.337598 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec488a3f-cd31-4c53-817e-22c302ab7678/mysql-bootstrap/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.570864 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec488a3f-cd31-4c53-817e-22c302ab7678/mysql-bootstrap/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.595472 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec488a3f-cd31-4c53-817e-22c302ab7678/galera/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.620612 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3e5b137b-69eb-4e58-98fb-d7a4afe639c8/memcached/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.751588 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7a7cd602-3896-4652-9764-b33305d9669d/openstackclient/0.log" Sep 29 20:16:31 crc kubenswrapper[4780]: I0929 20:16:31.935953 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_6ddadfee-0076-4181-aeff-aace0c0ffb1f/adoption/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.025088 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6fb0ef5f-8130-43d6-b9b5-fe5c480f842b/openstack-network-exporter/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.130468 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6fb0ef5f-8130-43d6-b9b5-fe5c480f842b/ovn-northd/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.210519 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bcafc040-9d77-4ea9-9515-cba560d9a6ca/openstack-network-exporter/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.379563 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bcafc040-9d77-4ea9-9515-cba560d9a6ca/ovsdbserver-nb/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.429810 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_84151286-7e33-474f-9247-d1222cae1067/openstack-network-exporter/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.552555 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_84151286-7e33-474f-9247-d1222cae1067/ovsdbserver-nb/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.652995 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_493a3b25-de4a-438c-b19e-210f6618c08d/openstack-network-exporter/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.675319 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_493a3b25-de4a-438c-b19e-210f6618c08d/ovsdbserver-nb/0.log" Sep 
Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.806625 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ac41da62-3218-48ed-b10d-144bf1f0e85f/openstack-network-exporter/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.874383 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ac41da62-3218-48ed-b10d-144bf1f0e85f/ovsdbserver-sb/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.987427 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fc6329f7-7f65-48b4-9a99-3fe225456e58/ovsdbserver-sb/0.log" Sep 29 20:16:32 crc kubenswrapper[4780]: I0929 20:16:32.995522 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fc6329f7-7f65-48b4-9a99-3fe225456e58/openstack-network-exporter/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.096506 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_0e63d364-5147-4383-ae55-fde7de3b1894/openstack-network-exporter/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.151134 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_0e63d364-5147-4383-ae55-fde7de3b1894/ovsdbserver-sb/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.284889 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dd09e83d-ac55-42fb-9d0c-b84c5d12c284/setup-container/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.441707 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dd09e83d-ac55-42fb-9d0c-b84c5d12c284/rabbitmq/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.457770 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dd09e83d-ac55-42fb-9d0c-b84c5d12c284/setup-container/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.511321 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82d758e0-ddd1-4c96-bfa9-bd81f14359ac/setup-container/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.680533 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82d758e0-ddd1-4c96-bfa9-bd81f14359ac/setup-container/0.log" Sep 29 20:16:33 crc kubenswrapper[4780]: I0929 20:16:33.755999 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82d758e0-ddd1-4c96-bfa9-bd81f14359ac/rabbitmq/0.log" Sep 29 20:17:03 crc kubenswrapper[4780]: I0929 20:17:03.225015 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:17:03 crc kubenswrapper[4780]: I0929 20:17:03.225880 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:17:19 crc kubenswrapper[4780]: I0929 20:17:19.052385 4780 generic.go:334] "Generic (PLEG): container finished" podID="e75b087b-b8e3-4926-9d30-c66d6e54eab9"
containerID="97d4ecb0cfdecfbeda64e76a3f5a2165131d1f4745e931d7688c086c26256aaf" exitCode=0 Sep 29 20:17:19 crc kubenswrapper[4780]: I0929 20:17:19.052484 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" event={"ID":"e75b087b-b8e3-4926-9d30-c66d6e54eab9","Type":"ContainerDied","Data":"97d4ecb0cfdecfbeda64e76a3f5a2165131d1f4745e931d7688c086c26256aaf"} Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.181458 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.225197 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-99rb4"] Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.232533 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-99rb4"] Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.366512 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host\") pod \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.367241 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57nqb\" (UniqueName: \"kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb\") pod \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\" (UID: \"e75b087b-b8e3-4926-9d30-c66d6e54eab9\") " Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.366911 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host" (OuterVolumeSpecName: "host") pod "e75b087b-b8e3-4926-9d30-c66d6e54eab9" (UID: "e75b087b-b8e3-4926-9d30-c66d6e54eab9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.368347 4780 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e75b087b-b8e3-4926-9d30-c66d6e54eab9-host\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.372890 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb" (OuterVolumeSpecName: "kube-api-access-57nqb") pod "e75b087b-b8e3-4926-9d30-c66d6e54eab9" (UID: "e75b087b-b8e3-4926-9d30-c66d6e54eab9"). InnerVolumeSpecName "kube-api-access-57nqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.470405 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57nqb\" (UniqueName: \"kubernetes.io/projected/e75b087b-b8e3-4926-9d30-c66d6e54eab9-kube-api-access-57nqb\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:20 crc kubenswrapper[4780]: I0929 20:17:20.770082 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75b087b-b8e3-4926-9d30-c66d6e54eab9" path="/var/lib/kubelet/pods/e75b087b-b8e3-4926-9d30-c66d6e54eab9/volumes" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.073430 4780 scope.go:117] "RemoveContainer" containerID="97d4ecb0cfdecfbeda64e76a3f5a2165131d1f4745e931d7688c086c26256aaf" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.073624 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-99rb4" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.435976 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-tp5ww"] Sep 29 20:17:21 crc kubenswrapper[4780]: E0929 20:17:21.436674 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75b087b-b8e3-4926-9d30-c66d6e54eab9" containerName="container-00" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.436688 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75b087b-b8e3-4926-9d30-c66d6e54eab9" containerName="container-00" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.436885 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75b087b-b8e3-4926-9d30-c66d6e54eab9" containerName="container-00" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.437526 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.595314 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.595906 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4xh8\" (UniqueName: \"kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.697698 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.697781 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.697851 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4xh8\" (UniqueName: \"kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.718183 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4xh8\" (UniqueName: \"kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8\") pod \"crc-debug-tp5ww\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:21 crc kubenswrapper[4780]: I0929 20:17:21.763750 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:22 crc kubenswrapper[4780]: I0929 20:17:22.084490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" event={"ID":"3e37fe48-e0fe-4aae-8e62-04374773029c","Type":"ContainerStarted","Data":"348613977f332e72476706bcaf7ff29d0bea83b39daf71dbee4e739068bfce1d"} Sep 29 20:17:22 crc kubenswrapper[4780]: I0929 20:17:22.084824 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" event={"ID":"3e37fe48-e0fe-4aae-8e62-04374773029c","Type":"ContainerStarted","Data":"d1001bd4635eb57e92d08a1ffc08ef9bc1803c0089776185def301f89c4e2d8a"} Sep 29 20:17:22 crc kubenswrapper[4780]: I0929 20:17:22.115948 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" podStartSLOduration=1.115924885 podStartE2EDuration="1.115924885s" podCreationTimestamp="2025-09-29 20:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:17:22.099641384 +0000 UTC m=+5642.047939458" watchObservedRunningTime="2025-09-29 20:17:22.115924885 +0000 UTC m=+5642.064222969" Sep 29 20:17:23 crc kubenswrapper[4780]: I0929 20:17:23.096059 4780 generic.go:334] "Generic (PLEG): container finished" podID="3e37fe48-e0fe-4aae-8e62-04374773029c" containerID="348613977f332e72476706bcaf7ff29d0bea83b39daf71dbee4e739068bfce1d" exitCode=0 Sep 29 20:17:23 crc kubenswrapper[4780]: I0929 20:17:23.096074 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" event={"ID":"3e37fe48-e0fe-4aae-8e62-04374773029c","Type":"ContainerDied","Data":"348613977f332e72476706bcaf7ff29d0bea83b39daf71dbee4e739068bfce1d"} Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.177359 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.333074 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4xh8\" (UniqueName: \"kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8\") pod \"3e37fe48-e0fe-4aae-8e62-04374773029c\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.333223 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host\") pod \"3e37fe48-e0fe-4aae-8e62-04374773029c\" (UID: \"3e37fe48-e0fe-4aae-8e62-04374773029c\") " Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.333399 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host" (OuterVolumeSpecName: "host") pod "3e37fe48-e0fe-4aae-8e62-04374773029c" (UID: "3e37fe48-e0fe-4aae-8e62-04374773029c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.333778 4780 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e37fe48-e0fe-4aae-8e62-04374773029c-host\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.348828 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8" (OuterVolumeSpecName: "kube-api-access-z4xh8") pod "3e37fe48-e0fe-4aae-8e62-04374773029c" (UID: "3e37fe48-e0fe-4aae-8e62-04374773029c"). InnerVolumeSpecName "kube-api-access-z4xh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:17:24 crc kubenswrapper[4780]: I0929 20:17:24.434690 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4xh8\" (UniqueName: \"kubernetes.io/projected/3e37fe48-e0fe-4aae-8e62-04374773029c-kube-api-access-z4xh8\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:25 crc kubenswrapper[4780]: I0929 20:17:25.110644 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" event={"ID":"3e37fe48-e0fe-4aae-8e62-04374773029c","Type":"ContainerDied","Data":"d1001bd4635eb57e92d08a1ffc08ef9bc1803c0089776185def301f89c4e2d8a"} Sep 29 20:17:25 crc kubenswrapper[4780]: I0929 20:17:25.110679 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1001bd4635eb57e92d08a1ffc08ef9bc1803c0089776185def301f89c4e2d8a" Sep 29 20:17:25 crc kubenswrapper[4780]: I0929 20:17:25.110726 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-tp5ww" Sep 29 20:17:28 crc kubenswrapper[4780]: I0929 20:17:28.348259 4780 scope.go:117] "RemoveContainer" containerID="16275540018b8a60a11155bd2fa9fd66463ad312af6c0ee038dc92e0317425ec" Sep 29 20:17:29 crc kubenswrapper[4780]: I0929 20:17:29.273498 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-tp5ww"] Sep 29 20:17:29 crc kubenswrapper[4780]: I0929 20:17:29.284244 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-tp5ww"] Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.483406 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-jnqqh"] Sep 29 20:17:30 crc kubenswrapper[4780]: E0929 20:17:30.485749 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e37fe48-e0fe-4aae-8e62-04374773029c" containerName="container-00" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.486013 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37fe48-e0fe-4aae-8e62-04374773029c" containerName="container-00" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.486680 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e37fe48-e0fe-4aae-8e62-04374773029c" containerName="container-00" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.488292 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.641943 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.642650 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fm59\" (UniqueName: \"kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.745135 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fm59\" (UniqueName: \"kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.745280 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.745421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.774206 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e37fe48-e0fe-4aae-8e62-04374773029c" path="/var/lib/kubelet/pods/3e37fe48-e0fe-4aae-8e62-04374773029c/volumes" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.781757 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fm59\" (UniqueName: \"kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59\") pod \"crc-debug-jnqqh\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:30 crc kubenswrapper[4780]: I0929 20:17:30.822863 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:31 crc kubenswrapper[4780]: I0929 20:17:31.169083 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" event={"ID":"35775394-3242-4a16-ae8a-1a8cb381baaf","Type":"ContainerStarted","Data":"a31ba9846b3194275d4e00d07e731ec3c9bb3318f993713103c4ad995f9f6f3a"} Sep 29 20:17:31 crc kubenswrapper[4780]: I0929 20:17:31.169491 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" event={"ID":"35775394-3242-4a16-ae8a-1a8cb381baaf","Type":"ContainerStarted","Data":"b08bf8597a28440f1b5b8c550769b03cd8555617cc3994924caabb4018ab3daa"} Sep 29 20:17:31 crc kubenswrapper[4780]: I0929 20:17:31.190509 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" podStartSLOduration=1.190489878 podStartE2EDuration="1.190489878s" podCreationTimestamp="2025-09-29 20:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-29 20:17:31.186985188 +0000 UTC m=+5651.135283252" watchObservedRunningTime="2025-09-29 20:17:31.190489878 +0000 UTC m=+5651.138787922" Sep 29 20:17:32 crc kubenswrapper[4780]: I0929 20:17:32.182814 4780 generic.go:334] "Generic (PLEG): container finished" podID="35775394-3242-4a16-ae8a-1a8cb381baaf" containerID="a31ba9846b3194275d4e00d07e731ec3c9bb3318f993713103c4ad995f9f6f3a" exitCode=0 Sep 29 20:17:32 crc kubenswrapper[4780]: I0929 20:17:32.182939 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" event={"ID":"35775394-3242-4a16-ae8a-1a8cb381baaf","Type":"ContainerDied","Data":"a31ba9846b3194275d4e00d07e731ec3c9bb3318f993713103c4ad995f9f6f3a"} Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.222982 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.223091 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.301943 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.341033 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-jnqqh"] Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.351040 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cn2gq/crc-debug-jnqqh"] Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.399582 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host\") pod \"35775394-3242-4a16-ae8a-1a8cb381baaf\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.399819 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fm59\" (UniqueName: \"kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59\") pod \"35775394-3242-4a16-ae8a-1a8cb381baaf\" (UID: \"35775394-3242-4a16-ae8a-1a8cb381baaf\") " Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.399907 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host" (OuterVolumeSpecName: "host") pod "35775394-3242-4a16-ae8a-1a8cb381baaf" (UID: "35775394-3242-4a16-ae8a-1a8cb381baaf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.400410 4780 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35775394-3242-4a16-ae8a-1a8cb381baaf-host\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.409355 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59" (OuterVolumeSpecName: "kube-api-access-7fm59") pod "35775394-3242-4a16-ae8a-1a8cb381baaf" (UID: "35775394-3242-4a16-ae8a-1a8cb381baaf"). InnerVolumeSpecName "kube-api-access-7fm59". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:17:33 crc kubenswrapper[4780]: I0929 20:17:33.503252 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fm59\" (UniqueName: \"kubernetes.io/projected/35775394-3242-4a16-ae8a-1a8cb381baaf-kube-api-access-7fm59\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:34 crc kubenswrapper[4780]: I0929 20:17:34.203951 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08bf8597a28440f1b5b8c550769b03cd8555617cc3994924caabb4018ab3daa" Sep 29 20:17:34 crc kubenswrapper[4780]: I0929 20:17:34.203993 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/crc-debug-jnqqh" Sep 29 20:17:34 crc kubenswrapper[4780]: I0929 20:17:34.762453 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35775394-3242-4a16-ae8a-1a8cb381baaf" path="/var/lib/kubelet/pods/35775394-3242-4a16-ae8a-1a8cb381baaf/volumes" Sep 29 20:17:34 crc kubenswrapper[4780]: I0929 20:17:34.941205 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/util/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.048597 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/util/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.081334 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/pull/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.135767 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/pull/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.263689 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/pull/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.287759 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/util/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.290123 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5ffb62280253d85b2af235b10999d22be8e5524c9accad58ab6091a469z5tt7_3ced0c80-3dc4-4f78-95d1-eaaa021aad95/extract/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.441190 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-qm8gn_f488a5b4-5b60-4e98-9095-5c6b3e7d580b/kube-rbac-proxy/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.529684 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-qm8gn_f488a5b4-5b60-4e98-9095-5c6b3e7d580b/manager/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.561444 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-n9smm_8add36ee-ae48-47aa-a1b8-39e26a2b61c4/kube-rbac-proxy/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.663175 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-n9smm_8add36ee-ae48-47aa-a1b8-39e26a2b61c4/manager/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.701067 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-xhck5_afe8c052-ff7e-4892-81fa-8045f69346eb/kube-rbac-proxy/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.776373 4780 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-xhck5_afe8c052-ff7e-4892-81fa-8045f69346eb/manager/0.log" Sep 29 20:17:35 crc kubenswrapper[4780]: I0929 20:17:35.903942 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-r4g5l_85948289-f8ff-4ccb-8322-17c68d0ca529/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.000490 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-r4g5l_85948289-f8ff-4ccb-8322-17c68d0ca529/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.025296 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-f2xqf_de762ea1-08cb-48cd-8e29-2d7523a63ef8/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.110905 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-f2xqf_de762ea1-08cb-48cd-8e29-2d7523a63ef8/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.157883 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-7ktf6_53223fa3-3901-4f53-9c6b-18e07485a7ad/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.231463 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-7ktf6_53223fa3-3901-4f53-9c6b-18e07485a7ad/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.389197 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-jzhjc_cd74a7d5-36fa-4c53-b9a6-9f9a733791d5/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.570171 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-zfhtk_0dffef5d-ec0f-4e39-a948-c670be2a8521/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.587014 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c7d9477-jzhjc_cd74a7d5-36fa-4c53-b9a6-9f9a733791d5/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.634254 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f589bc7f7-zfhtk_0dffef5d-ec0f-4e39-a948-c670be2a8521/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.766991 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-lssxn_40a3e409-3dbc-4936-819f-c64fe007d584/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.856247 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-lssxn_40a3e409-3dbc-4936-819f-c64fe007d584/manager/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.906168 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-slhwp_1b274b66-59c6-49e6-8469-dfaa9d5a85cc/kube-rbac-proxy/0.log" Sep 29 20:17:36 crc kubenswrapper[4780]: I0929 20:17:36.960804 4780 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-slhwp_1b274b66-59c6-49e6-8469-dfaa9d5a85cc/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.098567 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-46wcs_28049dad-f386-4b21-b525-63fd463b8c37/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.110099 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-46wcs_28049dad-f386-4b21-b525-63fd463b8c37/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.174296 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-lfnp5_6865eded-097c-49c7-a54d-cda27a2adc65/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.303933 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b96467f46-lfnp5_6865eded-097c-49c7-a54d-cda27a2adc65/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.372252 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-7mc4n_eabb644f-cfed-402e-8e6c-b98dc6ec30ef/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.455171 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79f9fc9fd8-7mc4n_eabb644f-cfed-402e-8e6c-b98dc6ec30ef/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.507289 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-dfxwb_23713df8-910e-453e-a639-cdfc43473071/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.558956 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fb7d6b8bf-dfxwb_23713df8-910e-453e-a639-cdfc43473071/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.668943 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh_dcc90bb2-08d8-448b-85bb-955bfc3a7371/manager/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.678211 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86b7cb4c5f5gpsh_dcc90bb2-08d8-448b-85bb-955bfc3a7371/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.777645 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-s9lgn_18dfa3ae-5e34-436b-87b9-f215e898567c/kube-rbac-proxy/0.log" Sep 29 20:17:37 crc kubenswrapper[4780]: I0929 20:17:37.931917 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-f9qwv_727c9941-0992-4a9c-8c56-ffa31bb24cf4/kube-rbac-proxy/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.136278 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56dc567787-f9qwv_727c9941-0992-4a9c-8c56-ffa31bb24cf4/operator/0.log" Sep 29 20:17:38 crc 
kubenswrapper[4780]: I0929 20:17:38.198672 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hrkrx_8cb0a136-583a-4403-924a-bdedd686f874/registry-server/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.359242 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-87km7_cec435bb-5818-41aa-8177-dfdddc267c00/kube-rbac-proxy/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.447909 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-d8s8d_541588f6-d71c-42ca-b4eb-515f5409f2d1/kube-rbac-proxy/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.468628 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-87km7_cec435bb-5818-41aa-8177-dfdddc267c00/manager/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.610195 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-d8s8d_541588f6-d71c-42ca-b4eb-515f5409f2d1/manager/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.676203 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-j92fn_e6b57f8b-1be2-48b1-be60-30a3583f6052/operator/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.743421 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7b7bb8bd67-s9lgn_18dfa3ae-5e34-436b-87b9-f215e898567c/manager/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.809944 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-625qh_ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a/kube-rbac-proxy/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.827214 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-657c6b68c7-625qh_ae1f68b1-09cb-4e07-b26d-fa895e1f2a1a/manager/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.909043 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-nznlf_e6865432-79dd-4823-a42d-bb08417a0f90/kube-rbac-proxy/0.log" Sep 29 20:17:38 crc kubenswrapper[4780]: I0929 20:17:38.988497 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-nznlf_e6865432-79dd-4823-a42d-bb08417a0f90/manager/0.log" Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.046439 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-szxrn_02f6d355-f384-4b36-b518-55ad38e66215/kube-rbac-proxy/0.log" Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.071241 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb97fcf96-szxrn_02f6d355-f384-4b36-b518-55ad38e66215/manager/0.log" Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.169742 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75756dd4d9-ksn7s_8fceb492-ba01-4e2f-b59b-6557da4e851a/kube-rbac-proxy/0.log" Sep 29 20:17:39 crc 
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.248006 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"]
Sep 29 20:17:39 crc kubenswrapper[4780]: E0929 20:17:39.248344 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35775394-3242-4a16-ae8a-1a8cb381baaf" containerName="container-00"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.248356 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="35775394-3242-4a16-ae8a-1a8cb381baaf" containerName="container-00"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.248529 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="35775394-3242-4a16-ae8a-1a8cb381baaf" containerName="container-00"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.249776 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.273194 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"]
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.395736 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.395852 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44j8g\" (UniqueName: \"kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.396071 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.497572 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.497682 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44j8g\" (UniqueName: \"kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.497730 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.498334 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.498383 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.530795 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44j8g\" (UniqueName: \"kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g\") pod \"redhat-marketplace-l787g\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:39 crc kubenswrapper[4780]: I0929 20:17:39.589977 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:40 crc kubenswrapper[4780]: I0929 20:17:40.104961 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"]
Sep 29 20:17:40 crc kubenswrapper[4780]: I0929 20:17:40.267147 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerStarted","Data":"f442ef096b809f8f0ee0660b8f80a36bfac0cf856f4b8d7309f96fc0e04bffeb"}
Sep 29 20:17:41 crc kubenswrapper[4780]: I0929 20:17:41.278367 4780 generic.go:334] "Generic (PLEG): container finished" podID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerID="ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b" exitCode=0
Sep 29 20:17:41 crc kubenswrapper[4780]: I0929 20:17:41.278449 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerDied","Data":"ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b"}
Sep 29 20:17:41 crc kubenswrapper[4780]: I0929 20:17:41.283169 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 29 20:17:43 crc kubenswrapper[4780]: I0929 20:17:43.298963 4780 generic.go:334] "Generic (PLEG): container finished" podID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerID="ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a" exitCode=0
Sep 29 20:17:43 crc kubenswrapper[4780]: I0929 20:17:43.299114 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerDied","Data":"ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a"}
Sep 29 20:17:44 crc kubenswrapper[4780]: I0929 20:17:44.310530 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerStarted","Data":"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e"}
Sep 29 20:17:44 crc kubenswrapper[4780]: I0929 20:17:44.330583 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l787g" podStartSLOduration=2.887092163 podStartE2EDuration="5.330568527s" podCreationTimestamp="2025-09-29 20:17:39 +0000 UTC" firstStartedPulling="2025-09-29 20:17:41.282896691 +0000 UTC m=+5661.231194745" lastFinishedPulling="2025-09-29 20:17:43.726373045 +0000 UTC m=+5663.674671109" observedRunningTime="2025-09-29 20:17:44.32575149 +0000 UTC m=+5664.274049534" watchObservedRunningTime="2025-09-29 20:17:44.330568527 +0000 UTC m=+5664.278866571"
Sep 29 20:17:49 crc kubenswrapper[4780]: I0929 20:17:49.590274 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:49 crc kubenswrapper[4780]: I0929 20:17:49.590688 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:49 crc kubenswrapper[4780]: I0929 20:17:49.652391 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:50 crc kubenswrapper[4780]: I0929 20:17:50.416237 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l787g"
Sep 29 20:17:50 crc kubenswrapper[4780]: I0929 20:17:50.469423 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"]
Sep 29 20:17:52 crc kubenswrapper[4780]: I0929 20:17:52.377941 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l787g" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="registry-server" containerID="cri-o://0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e" gracePeriod=2
Sep 29 20:17:52 crc kubenswrapper[4780]: I0929 20:17:52.893656 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l787g"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l787g" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.059915 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content\") pod \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.060078 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities\") pod \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.060203 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44j8g\" (UniqueName: \"kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g\") pod \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\" (UID: \"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169\") " Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.061474 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities" (OuterVolumeSpecName: "utilities") pod "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" (UID: "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.073282 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g" (OuterVolumeSpecName: "kube-api-access-44j8g") pod "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" (UID: "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169"). InnerVolumeSpecName "kube-api-access-44j8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.076669 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" (UID: "b9ca6cd4-2ac2-4357-a5e7-848fd0a92169"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.162137 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44j8g\" (UniqueName: \"kubernetes.io/projected/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-kube-api-access-44j8g\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.162175 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.162189 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.393911 4780 generic.go:334] "Generic (PLEG): container finished" podID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerID="0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e" exitCode=0 Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.393972 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerDied","Data":"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e"} Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.394082 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l787g" event={"ID":"b9ca6cd4-2ac2-4357-a5e7-848fd0a92169","Type":"ContainerDied","Data":"f442ef096b809f8f0ee0660b8f80a36bfac0cf856f4b8d7309f96fc0e04bffeb"} Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.394077 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l787g" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.394146 4780 scope.go:117] "RemoveContainer" containerID="0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.414468 4780 scope.go:117] "RemoveContainer" containerID="ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.446768 4780 scope.go:117] "RemoveContainer" containerID="ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.452203 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"] Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.460040 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l787g"] Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.484732 4780 scope.go:117] "RemoveContainer" containerID="0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e" Sep 29 20:17:53 crc kubenswrapper[4780]: E0929 20:17:53.485249 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e\": container with ID starting with 0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e not found: ID does not exist" containerID="0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.485289 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e"} err="failed to get container status \"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e\": rpc error: code = NotFound desc = could not find container \"0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e\": container with ID starting with 0f239856e52879b2dd2518e9a432c8ed5d6eb641939bd0e30f2f083c49dd466e not found: ID does not exist" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.485317 4780 scope.go:117] "RemoveContainer" containerID="ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a" Sep 29 20:17:53 crc kubenswrapper[4780]: E0929 20:17:53.485562 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a\": container with ID starting with ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a not found: ID does not exist" containerID="ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.485586 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a"} err="failed to get container status \"ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a\": rpc error: code = NotFound desc = could not find container \"ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a\": container with ID starting with ca435232a29f56eeb5584f9f389e9076a741c3d037c607d77f7cac4c3536f76a not found: ID does not exist" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.485604 4780 scope.go:117] "RemoveContainer" 
containerID="ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b" Sep 29 20:17:53 crc kubenswrapper[4780]: E0929 20:17:53.485936 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b\": container with ID starting with ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b not found: ID does not exist" containerID="ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b" Sep 29 20:17:53 crc kubenswrapper[4780]: I0929 20:17:53.485957 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b"} err="failed to get container status \"ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b\": rpc error: code = NotFound desc = could not find container \"ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b\": container with ID starting with ee544c8a0b5363bdb3bebe62a5e28beca4dc43d0d62c05781b12e0ba36811b0b not found: ID does not exist" Sep 29 20:17:54 crc kubenswrapper[4780]: I0929 20:17:54.783394 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" path="/var/lib/kubelet/pods/b9ca6cd4-2ac2-4357-a5e7-848fd0a92169/volumes" Sep 29 20:17:55 crc kubenswrapper[4780]: I0929 20:17:55.762111 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mkxzc_83670b30-2222-428b-b4cc-17d16e0bedb2/control-plane-machine-set-operator/0.log" Sep 29 20:17:55 crc kubenswrapper[4780]: I0929 20:17:55.898385 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d4z2z_0c169409-7ddd-4961-b837-847550878691/kube-rbac-proxy/0.log" Sep 29 20:17:55 crc kubenswrapper[4780]: I0929 20:17:55.944787 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-d4z2z_0c169409-7ddd-4961-b837-847550878691/machine-api-operator/0.log" Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.223233 4780 patch_prober.go:28] interesting pod/machine-config-daemon-jrs9w container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.225105 4780 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.225238 4780 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.226098 4780 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"} pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.226271 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" containerName="machine-config-daemon" containerID="cri-o://930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" gracePeriod=600
Sep 29 20:18:03 crc kubenswrapper[4780]: E0929 20:18:03.357965 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.494600 4780 generic.go:334] "Generic (PLEG): container finished" podID="67a6d63c-6762-464e-9216-a234506b74db" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" exitCode=0
Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.494698 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" event={"ID":"67a6d63c-6762-464e-9216-a234506b74db","Type":"ContainerDied","Data":"930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"}
Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.495333 4780 scope.go:117] "RemoveContainer" containerID="3c7d0867cfa2f7173f305d5b92aa3ffce4dd0a0e42d21fcd3573872eb7ac90e5"
Sep 29 20:18:03 crc kubenswrapper[4780]: I0929 20:18:03.496009 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"
Sep 29 20:18:03 crc kubenswrapper[4780]: E0929 20:18:03.496752 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:18:09 crc kubenswrapper[4780]: I0929 20:18:09.044148 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-lwn5c_bcdc02e7-5830-4c51-abac-696d1744137e/cert-manager-controller/0.log"
Sep 29 20:18:09 crc kubenswrapper[4780]: I0929 20:18:09.225678 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-p2mzw_6e73bac5-a163-4f7e-a30d-31f4e9fc9b9a/cert-manager-cainjector/0.log"
Sep 29 20:18:09 crc kubenswrapper[4780]: I0929 20:18:09.254317 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-4hvmj_b94c3657-f10a-48a3-a551-696d692130cb/cert-manager-webhook/0.log"
Sep 29 20:18:15 crc kubenswrapper[4780]: I0929 20:18:15.753560 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"
Sep 29 20:18:15 crc kubenswrapper[4780]: E0929 20:18:15.754404 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
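"back-off 5m0s restarting failed container" means the restart backoff has reached its ceiling: the kubelet delays each restart of a crashing container, roughly doubling the delay per crash up to a cap, and the sync attempts in between are skipped with the CrashLoopBackOff error repeated here. A sketch assuming the upstream defaults of a 10s base and 5m cap (these constants are an assumption; the log itself only states the 5m0s figure):

    def crashloop_delay(restarts: int, base: float = 10.0, cap: float = 300.0) -> float:
        """Assumed kubelet-style restart delay: base doubling per crash, capped."""
        return min(cap, base * 2 ** restarts)

    print([crashloop_delay(n) for n in range(6)])  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0]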
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.282706 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-xq6qz_aca73d30-931f-40bb-8af6-ca484c734840/nmstate-console-plugin/0.log"
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.425227 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tbxfv_8630f421-2559-4a68-9f18-4eed4e760add/kube-rbac-proxy/0.log"
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.449625 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zdk8n_cc3b4e95-a14f-4ba5-a5dd-2d6fc3e9cc7a/nmstate-handler/0.log"
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.523110 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tbxfv_8630f421-2559-4a68-9f18-4eed4e760add/nmstate-metrics/0.log"
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.620673 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-qtlkq_c110df1d-a352-4af1-9b48-3d68bd11f230/nmstate-operator/0.log"
Sep 29 20:18:22 crc kubenswrapper[4780]: I0929 20:18:22.742719 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-dq26q_bd2d744d-8c52-4b60-a8e0-95999db053fc/nmstate-webhook/0.log"
Sep 29 20:18:28 crc kubenswrapper[4780]: I0929 20:18:28.421170 4780 scope.go:117] "RemoveContainer" containerID="b877db94dcd31c3529579743d199f56e31bd63324ba265c6a7aecb462d66c9ad"
Sep 29 20:18:28 crc kubenswrapper[4780]: I0929 20:18:28.462129 4780 scope.go:117] "RemoveContainer" containerID="ee6742837406809c4437c243810da7ad1b7b51048e989b6ee08b5648ffdfff5f"
Sep 29 20:18:28 crc kubenswrapper[4780]: I0929 20:18:28.513287 4780 scope.go:117] "RemoveContainer" containerID="e9ab707808f5be1285d3889954d76ee347f1b0c9307204680ca56cecb158310b"
Sep 29 20:18:28 crc kubenswrapper[4780]: I0929 20:18:28.759272 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"
Sep 29 20:18:28 crc kubenswrapper[4780]: E0929 20:18:28.759530 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.494848 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zchw7_dd6a4e23-fac8-4a60-8aed-541962416e4a/kube-rbac-proxy/0.log"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.680254 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-frr-files/0.log"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.824344 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zchw7_dd6a4e23-fac8-4a60-8aed-541962416e4a/controller/0.log"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.887529 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-frr-files/0.log"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.902746 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-reloader/0.log"
Sep 29 20:18:37 crc kubenswrapper[4780]: I0929 20:18:37.915266 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-metrics/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.013258 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-reloader/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.171683 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-reloader/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.184994 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-metrics/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.190688 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-frr-files/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.191516 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-metrics/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.404010 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-frr-files/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.416557 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-metrics/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.416582 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/cp-reloader/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.421540 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/controller/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.581177 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/frr-metrics/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.620505 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/kube-rbac-proxy/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.641970 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/kube-rbac-proxy-frr/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.734435 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/reloader/0.log"
Sep 29 20:18:38 crc kubenswrapper[4780]: I0929 20:18:38.835706 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-zwqlr_5ab4aafd-d570-4892-ac9c-75a4cb18b1af/frr-k8s-webhook-server/0.log"
Sep 29 20:18:39 crc kubenswrapper[4780]: I0929 20:18:39.077701 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56b5b4fb49-hrxkp_ce41758d-ee19-4ea0-a3b2-986d2f51d538/manager/0.log"
Sep 29 20:18:39 crc kubenswrapper[4780]: I0929 20:18:39.222453 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b5dc64569-rr9r9_94b2302d-6d22-4d24-8186-5707782a3cb6/webhook-server/0.log"
Sep 29 20:18:39 crc kubenswrapper[4780]: I0929 20:18:39.271465 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4w6qq_758227bc-a5d7-4889-a3fa-0b97b5212c6f/kube-rbac-proxy/0.log"
Sep 29 20:18:39 crc kubenswrapper[4780]: I0929 20:18:39.919529 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4w6qq_758227bc-a5d7-4889-a3fa-0b97b5212c6f/speaker/0.log"
Sep 29 20:18:40 crc kubenswrapper[4780]: I0929 20:18:40.212310 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-flc8q_390f974b-7808-487b-95b8-b72df5367294/frr/0.log"
Sep 29 20:18:41 crc kubenswrapper[4780]: I0929 20:18:41.754330 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"
Sep 29 20:18:41 crc kubenswrapper[4780]: E0929 20:18:41.754730 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.153038 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/util/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.311995 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/util/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.334490 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/pull/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.337095 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/pull/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.487596 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/pull/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.507776 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/extract/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.511534 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69psvjp_432abf3f-794c-4850-88e9-b1d509c9dd42/util/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.658801 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/util/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.754199 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba"
Sep 29 20:18:53 crc kubenswrapper[4780]: E0929 20:18:53.776653 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.875781 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/pull/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.877779 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/util/0.log"
Sep 29 20:18:53 crc kubenswrapper[4780]: I0929 20:18:53.898243 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/pull/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.041041 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/util/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.073382 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/extract/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.099753 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcntztr_eca29428-21f3-4e84-81b2-a42e23d90c23/pull/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.266904 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-utilities/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.389222 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-utilities/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.396888 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-content/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.407802 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-content/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.539465 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-content/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.588036 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/extract-utilities/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.723196 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-utilities/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.947019 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-utilities/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.972669 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4fggl_6a146d9b-2d85-4e9e-a76c-50cf4bfbbe84/registry-server/0.log"
Sep 29 20:18:54 crc kubenswrapper[4780]: I0929 20:18:54.973036 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-content/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.020258 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-content/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.093439 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-utilities/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.122177 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/extract-content/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.332607 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/util/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.551470 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/pull/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.553212 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/util/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.592799 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/pull/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.722901 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/pull/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.738596 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6bd54_5059373a-528f-485b-afbe-2bd945289b0b/registry-server/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.762971 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/util/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.796553 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d966dqpn_fad3af9b-342c-4ae5-b607-5efaaf0a9a05/extract/0.log"
Sep 29 20:18:55 crc kubenswrapper[4780]: I0929 20:18:55.928249 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ld5pv_b6cfd83e-4c6c-4e46-8981-81d25b08d81e/marketplace-operator/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.038313 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-utilities/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.148995 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-utilities/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.179917 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-content/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.194912 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-content/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.408319 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-content/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.413711 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/extract-utilities/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.469368 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-utilities/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.565167 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-68x8k_b0dc84e5-abe0-4e53-813c-0363cf9de12f/registry-server/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.656152 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-content/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.677003 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-utilities/0.log"
Sep 29 20:18:56 crc kubenswrapper[4780]:
I0929 20:18:56.701774 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-content/0.log" Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.857741 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-utilities/0.log" Sep 29 20:18:56 crc kubenswrapper[4780]: I0929 20:18:56.875969 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/extract-content/0.log" Sep 29 20:18:57 crc kubenswrapper[4780]: I0929 20:18:57.442059 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xzf5k_5ff6fc42-ccc4-4d32-8699-0ad29962b340/registry-server/0.log" Sep 29 20:19:08 crc kubenswrapper[4780]: I0929 20:19:08.753620 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:19:08 crc kubenswrapper[4780]: E0929 20:19:08.754691 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:19:22 crc kubenswrapper[4780]: E0929 20:19:22.262887 4780 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:46110->38.102.83.80:37067: write tcp 38.102.83.80:46110->38.102.83.80:37067: write: broken pipe Sep 29 20:19:23 crc kubenswrapper[4780]: I0929 20:19:23.753572 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:19:23 crc kubenswrapper[4780]: E0929 20:19:23.754615 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:19:36 crc kubenswrapper[4780]: I0929 20:19:36.753222 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:19:36 crc kubenswrapper[4780]: E0929 20:19:36.753835 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:19:48 crc kubenswrapper[4780]: I0929 20:19:48.754415 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:19:48 crc kubenswrapper[4780]: E0929 20:19:48.756784 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:20:02 crc kubenswrapper[4780]: I0929 20:20:02.753859 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:20:02 crc kubenswrapper[4780]: E0929 20:20:02.755219 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:20:14 crc kubenswrapper[4780]: I0929 20:20:14.754901 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:20:14 crc kubenswrapper[4780]: E0929 20:20:14.756450 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:20:26 crc kubenswrapper[4780]: I0929 20:20:26.753921 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:20:26 crc kubenswrapper[4780]: E0929 20:20:26.754733 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.572777 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:28 crc kubenswrapper[4780]: E0929 20:20:28.573195 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="registry-server" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.573212 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="registry-server" Sep 29 20:20:28 crc kubenswrapper[4780]: E0929 20:20:28.573244 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="extract-content" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.573253 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="extract-content" Sep 29 20:20:28 crc kubenswrapper[4780]: E0929 20:20:28.573265 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="extract-utilities" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.573272 4780 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="extract-utilities" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.573471 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ca6cd4-2ac2-4357-a5e7-848fd0a92169" containerName="registry-server" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.574618 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.585828 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.694430 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.694558 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.694858 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nwj\" (UniqueName: \"kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.796933 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nwj\" (UniqueName: \"kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.797140 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.797198 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.798195 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc 
kubenswrapper[4780]: I0929 20:20:28.798336 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.827100 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nwj\" (UniqueName: \"kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj\") pod \"community-operators-g7lrf\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:28 crc kubenswrapper[4780]: I0929 20:20:28.910791 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:29 crc kubenswrapper[4780]: I0929 20:20:29.401572 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:29 crc kubenswrapper[4780]: I0929 20:20:29.982994 4780 generic.go:334] "Generic (PLEG): container finished" podID="0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" containerID="04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911" exitCode=0 Sep 29 20:20:29 crc kubenswrapper[4780]: I0929 20:20:29.983476 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerDied","Data":"04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911"} Sep 29 20:20:29 crc kubenswrapper[4780]: I0929 20:20:29.983533 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerStarted","Data":"1e4bb5b629e5cf752eafd58dd2e572770871825f45df1f105408b1e0887596a5"} Sep 29 20:20:30 crc kubenswrapper[4780]: I0929 20:20:30.996562 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerStarted","Data":"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce"} Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.010388 4780 generic.go:334] "Generic (PLEG): container finished" podID="76413d88-7478-4108-8f6f-c63fa89eb825" containerID="eb961d32f4b3107a02d0f461f85dbb79eef539acb7d190bdf55bb5bb7215c85a" exitCode=0 Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.010486 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" event={"ID":"76413d88-7478-4108-8f6f-c63fa89eb825","Type":"ContainerDied","Data":"eb961d32f4b3107a02d0f461f85dbb79eef539acb7d190bdf55bb5bb7215c85a"} Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.011287 4780 scope.go:117] "RemoveContainer" containerID="eb961d32f4b3107a02d0f461f85dbb79eef539acb7d190bdf55bb5bb7215c85a" Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.015476 4780 generic.go:334] "Generic (PLEG): container finished" podID="0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" containerID="7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce" exitCode=0 Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.015545 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" 
event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerDied","Data":"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce"} Sep 29 20:20:32 crc kubenswrapper[4780]: I0929 20:20:32.208614 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cn2gq_must-gather-rqkr8_76413d88-7478-4108-8f6f-c63fa89eb825/gather/0.log" Sep 29 20:20:33 crc kubenswrapper[4780]: I0929 20:20:33.037931 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerStarted","Data":"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b"} Sep 29 20:20:33 crc kubenswrapper[4780]: I0929 20:20:33.068512 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g7lrf" podStartSLOduration=2.634568607 podStartE2EDuration="5.06848672s" podCreationTimestamp="2025-09-29 20:20:28 +0000 UTC" firstStartedPulling="2025-09-29 20:20:29.986733712 +0000 UTC m=+5829.935031756" lastFinishedPulling="2025-09-29 20:20:32.420651815 +0000 UTC m=+5832.368949869" observedRunningTime="2025-09-29 20:20:33.058590401 +0000 UTC m=+5833.006888455" watchObservedRunningTime="2025-09-29 20:20:33.06848672 +0000 UTC m=+5833.016784784" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.375334 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.383103 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.394138 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chvzh\" (UniqueName: \"kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.394323 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.394372 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.428866 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.495275 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 
20:20:35.495356 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.495425 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chvzh\" (UniqueName: \"kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.496410 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.496687 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.527882 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chvzh\" (UniqueName: \"kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh\") pod \"certified-operators-jwxj6\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:35 crc kubenswrapper[4780]: I0929 20:20:35.709168 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:36 crc kubenswrapper[4780]: I0929 20:20:36.174764 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:36 crc kubenswrapper[4780]: W0929 20:20:36.185379 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01ebbf6_58da_4d19_9bf1_9fec18836597.slice/crio-ef8a4ea9790e99c8038defd0f7961b331d1447b076e6c4943e4dd2b39ef87be3 WatchSource:0}: Error finding container ef8a4ea9790e99c8038defd0f7961b331d1447b076e6c4943e4dd2b39ef87be3: Status 404 returned error can't find the container with id ef8a4ea9790e99c8038defd0f7961b331d1447b076e6c4943e4dd2b39ef87be3 Sep 29 20:20:37 crc kubenswrapper[4780]: I0929 20:20:37.100032 4780 generic.go:334] "Generic (PLEG): container finished" podID="f01ebbf6-58da-4d19-9bf1-9fec18836597" containerID="25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad" exitCode=0 Sep 29 20:20:37 crc kubenswrapper[4780]: I0929 20:20:37.100095 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerDied","Data":"25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad"} Sep 29 20:20:37 crc kubenswrapper[4780]: I0929 20:20:37.100373 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerStarted","Data":"ef8a4ea9790e99c8038defd0f7961b331d1447b076e6c4943e4dd2b39ef87be3"} Sep 29 20:20:38 crc kubenswrapper[4780]: I0929 20:20:38.112379 4780 generic.go:334] "Generic (PLEG): container finished" podID="f01ebbf6-58da-4d19-9bf1-9fec18836597" containerID="8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf" exitCode=0 Sep 29 20:20:38 crc kubenswrapper[4780]: I0929 20:20:38.112455 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerDied","Data":"8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf"} Sep 29 20:20:38 crc kubenswrapper[4780]: I0929 20:20:38.753742 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:20:38 crc kubenswrapper[4780]: E0929 20:20:38.754161 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:20:38 crc kubenswrapper[4780]: I0929 20:20:38.911642 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:38 crc kubenswrapper[4780]: I0929 20:20:38.912016 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:39 crc kubenswrapper[4780]: I0929 20:20:39.005760 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:39 crc kubenswrapper[4780]: 
I0929 20:20:39.123763 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerStarted","Data":"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec"} Sep 29 20:20:39 crc kubenswrapper[4780]: I0929 20:20:39.149893 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwxj6" podStartSLOduration=2.583496378 podStartE2EDuration="4.149871298s" podCreationTimestamp="2025-09-29 20:20:35 +0000 UTC" firstStartedPulling="2025-09-29 20:20:37.105933254 +0000 UTC m=+5837.054231338" lastFinishedPulling="2025-09-29 20:20:38.672308174 +0000 UTC m=+5838.620606258" observedRunningTime="2025-09-29 20:20:39.145918586 +0000 UTC m=+5839.094216640" watchObservedRunningTime="2025-09-29 20:20:39.149871298 +0000 UTC m=+5839.098169362" Sep 29 20:20:39 crc kubenswrapper[4780]: I0929 20:20:39.187145 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:40 crc kubenswrapper[4780]: I0929 20:20:40.848696 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cn2gq/must-gather-rqkr8"] Sep 29 20:20:40 crc kubenswrapper[4780]: I0929 20:20:40.849251 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" podUID="76413d88-7478-4108-8f6f-c63fa89eb825" containerName="copy" containerID="cri-o://57208cde0ea680b50e0b86894dae78af7dd9f3318d8b5cc3a73d427dd1824f58" gracePeriod=2 Sep 29 20:20:40 crc kubenswrapper[4780]: I0929 20:20:40.859906 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cn2gq/must-gather-rqkr8"] Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.156395 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cn2gq_must-gather-rqkr8_76413d88-7478-4108-8f6f-c63fa89eb825/copy/0.log" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.160322 4780 generic.go:334] "Generic (PLEG): container finished" podID="76413d88-7478-4108-8f6f-c63fa89eb825" containerID="57208cde0ea680b50e0b86894dae78af7dd9f3318d8b5cc3a73d427dd1824f58" exitCode=143 Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.276619 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cn2gq_must-gather-rqkr8_76413d88-7478-4108-8f6f-c63fa89eb825/copy/0.log" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.276967 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.321634 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqd6p\" (UniqueName: \"kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p\") pod \"76413d88-7478-4108-8f6f-c63fa89eb825\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.321795 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output\") pod \"76413d88-7478-4108-8f6f-c63fa89eb825\" (UID: \"76413d88-7478-4108-8f6f-c63fa89eb825\") " Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.337507 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p" (OuterVolumeSpecName: "kube-api-access-fqd6p") pod "76413d88-7478-4108-8f6f-c63fa89eb825" (UID: "76413d88-7478-4108-8f6f-c63fa89eb825"). InnerVolumeSpecName "kube-api-access-fqd6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.423116 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqd6p\" (UniqueName: \"kubernetes.io/projected/76413d88-7478-4108-8f6f-c63fa89eb825-kube-api-access-fqd6p\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.434375 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "76413d88-7478-4108-8f6f-c63fa89eb825" (UID: "76413d88-7478-4108-8f6f-c63fa89eb825"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.528624 4780 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76413d88-7478-4108-8f6f-c63fa89eb825-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:41 crc kubenswrapper[4780]: I0929 20:20:41.554606 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.173676 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cn2gq_must-gather-rqkr8_76413d88-7478-4108-8f6f-c63fa89eb825/copy/0.log" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.175073 4780 scope.go:117] "RemoveContainer" containerID="57208cde0ea680b50e0b86894dae78af7dd9f3318d8b5cc3a73d427dd1824f58" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.175079 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cn2gq/must-gather-rqkr8" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.175269 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g7lrf" podUID="0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" containerName="registry-server" containerID="cri-o://e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b" gracePeriod=2 Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.216458 4780 scope.go:117] "RemoveContainer" containerID="eb961d32f4b3107a02d0f461f85dbb79eef539acb7d190bdf55bb5bb7215c85a" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.640940 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.647964 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities\") pod \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.648033 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content\") pod \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.648134 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nwj\" (UniqueName: \"kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj\") pod \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\" (UID: \"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d\") " Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.649733 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities" (OuterVolumeSpecName: "utilities") pod "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" (UID: "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.654000 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj" (OuterVolumeSpecName: "kube-api-access-w9nwj") pod "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" (UID: "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d"). InnerVolumeSpecName "kube-api-access-w9nwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.730950 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" (UID: "0cd222f3-96a4-4cb7-975c-a056ee4d0b8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.750326 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.750398 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.750419 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nwj\" (UniqueName: \"kubernetes.io/projected/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d-kube-api-access-w9nwj\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:42 crc kubenswrapper[4780]: I0929 20:20:42.770117 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76413d88-7478-4108-8f6f-c63fa89eb825" path="/var/lib/kubelet/pods/76413d88-7478-4108-8f6f-c63fa89eb825/volumes" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.191072 4780 generic.go:334] "Generic (PLEG): container finished" podID="0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" containerID="e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b" exitCode=0 Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.191141 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerDied","Data":"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b"} Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.191473 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7lrf" event={"ID":"0cd222f3-96a4-4cb7-975c-a056ee4d0b8d","Type":"ContainerDied","Data":"1e4bb5b629e5cf752eafd58dd2e572770871825f45df1f105408b1e0887596a5"} Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.191490 4780 scope.go:117] "RemoveContainer" containerID="e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.191220 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7lrf" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.223703 4780 scope.go:117] "RemoveContainer" containerID="7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.224952 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.231079 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g7lrf"] Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.241800 4780 scope.go:117] "RemoveContainer" containerID="04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.309378 4780 scope.go:117] "RemoveContainer" containerID="e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b" Sep 29 20:20:43 crc kubenswrapper[4780]: E0929 20:20:43.310461 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b\": container with ID starting with e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b not found: ID does not exist" containerID="e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.310511 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b"} err="failed to get container status \"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b\": rpc error: code = NotFound desc = could not find container \"e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b\": container with ID starting with e421a84f5763e0718b87e014b3a8276e570f2b33ad6e42643e55a4796f03986b not found: ID does not exist" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.310547 4780 scope.go:117] "RemoveContainer" containerID="7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce" Sep 29 20:20:43 crc kubenswrapper[4780]: E0929 20:20:43.311090 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce\": container with ID starting with 7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce not found: ID does not exist" containerID="7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.311146 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce"} err="failed to get container status \"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce\": rpc error: code = NotFound desc = could not find container \"7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce\": container with ID starting with 7e5668433b14b55699883ae939f7fd337241043a64149be07494ce74e776e2ce not found: ID does not exist" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.311173 4780 scope.go:117] "RemoveContainer" containerID="04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911" Sep 29 20:20:43 crc kubenswrapper[4780]: E0929 20:20:43.311553 4780 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911\": container with ID starting with 04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911 not found: ID does not exist" containerID="04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911" Sep 29 20:20:43 crc kubenswrapper[4780]: I0929 20:20:43.311596 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911"} err="failed to get container status \"04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911\": rpc error: code = NotFound desc = could not find container \"04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911\": container with ID starting with 04646526cea96d1bb390ce681e2839f667da3284025d43baa89468636abb6911 not found: ID does not exist" Sep 29 20:20:44 crc kubenswrapper[4780]: I0929 20:20:44.765312 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd222f3-96a4-4cb7-975c-a056ee4d0b8d" path="/var/lib/kubelet/pods/0cd222f3-96a4-4cb7-975c-a056ee4d0b8d/volumes" Sep 29 20:20:45 crc kubenswrapper[4780]: I0929 20:20:45.709977 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:45 crc kubenswrapper[4780]: I0929 20:20:45.710405 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:45 crc kubenswrapper[4780]: I0929 20:20:45.796158 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:46 crc kubenswrapper[4780]: I0929 20:20:46.269231 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:47 crc kubenswrapper[4780]: I0929 20:20:47.355568 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.238641 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwxj6" podUID="f01ebbf6-58da-4d19-9bf1-9fec18836597" containerName="registry-server" containerID="cri-o://d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec" gracePeriod=2 Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.739572 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.879522 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content\") pod \"f01ebbf6-58da-4d19-9bf1-9fec18836597\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.879776 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chvzh\" (UniqueName: \"kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh\") pod \"f01ebbf6-58da-4d19-9bf1-9fec18836597\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.879885 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities\") pod \"f01ebbf6-58da-4d19-9bf1-9fec18836597\" (UID: \"f01ebbf6-58da-4d19-9bf1-9fec18836597\") " Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.880812 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities" (OuterVolumeSpecName: "utilities") pod "f01ebbf6-58da-4d19-9bf1-9fec18836597" (UID: "f01ebbf6-58da-4d19-9bf1-9fec18836597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.886439 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh" (OuterVolumeSpecName: "kube-api-access-chvzh") pod "f01ebbf6-58da-4d19-9bf1-9fec18836597" (UID: "f01ebbf6-58da-4d19-9bf1-9fec18836597"). InnerVolumeSpecName "kube-api-access-chvzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.941853 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f01ebbf6-58da-4d19-9bf1-9fec18836597" (UID: "f01ebbf6-58da-4d19-9bf1-9fec18836597"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.982255 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chvzh\" (UniqueName: \"kubernetes.io/projected/f01ebbf6-58da-4d19-9bf1-9fec18836597-kube-api-access-chvzh\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.982312 4780 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-utilities\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:48 crc kubenswrapper[4780]: I0929 20:20:48.982331 4780 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01ebbf6-58da-4d19-9bf1-9fec18836597-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.265408 4780 generic.go:334] "Generic (PLEG): container finished" podID="f01ebbf6-58da-4d19-9bf1-9fec18836597" containerID="d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec" exitCode=0 Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.265541 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwxj6" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.265534 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerDied","Data":"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec"} Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.265888 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwxj6" event={"ID":"f01ebbf6-58da-4d19-9bf1-9fec18836597","Type":"ContainerDied","Data":"ef8a4ea9790e99c8038defd0f7961b331d1447b076e6c4943e4dd2b39ef87be3"} Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.265922 4780 scope.go:117] "RemoveContainer" containerID="d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.299497 4780 scope.go:117] "RemoveContainer" containerID="8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.320511 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.329565 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwxj6"] Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.331646 4780 scope.go:117] "RemoveContainer" containerID="25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.363885 4780 scope.go:117] "RemoveContainer" containerID="d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec" Sep 29 20:20:49 crc kubenswrapper[4780]: E0929 20:20:49.364502 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec\": container with ID starting with d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec not found: ID does not exist" containerID="d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.364546 
4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec"} err="failed to get container status \"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec\": rpc error: code = NotFound desc = could not find container \"d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec\": container with ID starting with d407fd61ab33da621c8678d6b780c705d4848e31ee5714eee167cdb45a202fec not found: ID does not exist" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.364576 4780 scope.go:117] "RemoveContainer" containerID="8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf" Sep 29 20:20:49 crc kubenswrapper[4780]: E0929 20:20:49.365995 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf\": container with ID starting with 8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf not found: ID does not exist" containerID="8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.366115 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf"} err="failed to get container status \"8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf\": rpc error: code = NotFound desc = could not find container \"8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf\": container with ID starting with 8d38be1576fe14ca098f2ba9f04f0ee69277ad64ea77b3ca78e33b4acfbdacbf not found: ID does not exist" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.366171 4780 scope.go:117] "RemoveContainer" containerID="25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad" Sep 29 20:20:49 crc kubenswrapper[4780]: E0929 20:20:49.366569 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad\": container with ID starting with 25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad not found: ID does not exist" containerID="25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad" Sep 29 20:20:49 crc kubenswrapper[4780]: I0929 20:20:49.366599 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad"} err="failed to get container status \"25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad\": rpc error: code = NotFound desc = could not find container \"25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad\": container with ID starting with 25c0995bc2e3816892bff55910fa3bf1bfb7b8e6a927859dd903d2186a3398ad not found: ID does not exist" Sep 29 20:20:50 crc kubenswrapper[4780]: I0929 20:20:50.768648 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01ebbf6-58da-4d19-9bf1-9fec18836597" path="/var/lib/kubelet/pods/f01ebbf6-58da-4d19-9bf1-9fec18836597/volumes" Sep 29 20:20:51 crc kubenswrapper[4780]: I0929 20:20:51.754820 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:20:51 crc kubenswrapper[4780]: E0929 20:20:51.755324 4780 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:21:03 crc kubenswrapper[4780]: I0929 20:21:03.753247 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:21:03 crc kubenswrapper[4780]: E0929 20:21:03.753937 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:21:18 crc kubenswrapper[4780]: I0929 20:21:18.753363 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:21:18 crc kubenswrapper[4780]: E0929 20:21:18.753921 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:21:32 crc kubenswrapper[4780]: I0929 20:21:32.757071 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:21:32 crc kubenswrapper[4780]: E0929 20:21:32.757665 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:21:44 crc kubenswrapper[4780]: I0929 20:21:44.754115 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:21:44 crc kubenswrapper[4780]: E0929 20:21:44.755331 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db" Sep 29 20:21:58 crc kubenswrapper[4780]: I0929 20:21:58.753918 4780 scope.go:117] "RemoveContainer" containerID="930f1da45a6d14e6a1d383d2fda6970af228b8399f63882bf8fa30ff59e557ba" Sep 29 20:21:58 crc kubenswrapper[4780]: E0929 20:21:58.755167 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jrs9w_openshift-machine-config-operator(67a6d63c-6762-464e-9216-a234506b74db)\"" pod="openshift-machine-config-operator/machine-config-daemon-jrs9w" podUID="67a6d63c-6762-464e-9216-a234506b74db"
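Note: the tail of this log is the kubelet's restart throttle in steady state. machine-config-daemon keeps crashing, so each sync attempt (every 12-15 seconds here) is refused because the per-container restart back-off has already saturated at its 5m0s ceiling, which is exactly the "back-off 5m0s restarting failed container" text above. A small Go sketch of the doubling-with-cap policy behind that message follows; the 10s base and 5m cap match the kubelet's long-standing defaults, and the helper name is illustrative.

    package main

    import (
        "fmt"
        "time"
    )

    // restartDelay returns the crash-loop back-off for the nth consecutive
    // failure: the base delay doubles each time and saturates at maxDelay.
    // (Illustrative model of the policy, not kubelet source.)
    func restartDelay(base, maxDelay time.Duration, failures int) time.Duration {
        d := base
        for i := 1; i < failures; i++ {
            d *= 2
            if d >= maxDelay {
                return maxDelay
            }
        }
        return d
    }

    func main() {
        base, maxDelay := 10*time.Second, 5*time.Minute // assumed kubelet defaults
        for n := 1; n <= 7; n++ {
            fmt.Printf("failure %d -> wait %v\n", n, restartDelay(base, maxDelay, n))
        }
        // Prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s from the 6th
        // failure onward -- the capped state this pod is stuck in.
    }

Once the cap is reached the error repeats verbatim on every sync until the container either starts successfully (which resets the back-off) or the pod is deleted.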